# lm

Here are 79 public repositories matching this topic...

ModuleFormer is a MoE-based architecture that includes two different types of experts: stick-breaking attention heads and feedforward experts. We released a collection of ModuleFormer-based Language Models (MoLM) ranging in scale from 4 billion to 8 billion parameters.

  • Updated Apr 10, 2024
  • Python
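The entry above describes ModuleFormer's mixture-of-experts design at a high level. Below is a minimal, illustrative sketch of a sparse MoE feedforward layer with top-k routing in PyTorch; the expert count, layer sizes, and routing scheme are assumptions for illustration only and do not reproduce ModuleFormer's stick-breaking attention heads or its released MoLM checkpoints.

```python
# Minimal top-k mixture-of-experts feedforward layer (illustrative sketch,
# not the ModuleFormer implementation). Dimensions and expert count are
# arbitrary assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoEFeedForward(nn.Module):
    def __init__(self, d_model=512, d_hidden=2048, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        # Router scores each token against every expert.
        self.router = nn.Linear(d_model, num_experts)
        # Each expert is an ordinary two-layer feedforward block.
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x):
        # x: (batch, seq, d_model)
        logits = self.router(x)                           # (B, S, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            idx = indices[..., slot]                      # (B, S) expert id per token
            w = weights[..., slot].unsqueeze(-1)          # (B, S, 1) gating weight
            for e, expert in enumerate(self.experts):
                mask = (idx == e).unsqueeze(-1)           # tokens routed to expert e
                if mask.any():
                    out = out + mask * w * expert(x)      # dense compute, masked; clear but inefficient
        return out


if __name__ == "__main__":
    layer = MoEFeedForward()
    tokens = torch.randn(2, 16, 512)
    print(layer(tokens).shape)  # torch.Size([2, 16, 512])
```

The loop over experts keeps the sketch readable; production MoE layers instead gather the tokens assigned to each expert so that every expert only processes its own slice.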
