Hacker News
- JetMoE: Reaching LLaMA2 performance with 0.1M dollars https://research.myshell.ai/jetmoe 90 comments
Linking pages
- Myshell AI and MIT Researchers Propose JetMoE-8B: A Super-Efficient LLM Model that Achieves LLaMA2-Level Training with Just US $0.1M - MarkTechPost https://www.marktechpost.com/2024/04/05/myshell-ai-and-mit-researchers-propose-jetmoe-8b-a-super-efficient-llm-model-that-achieves-llama2-level-training-with-just-us-0-1m/ 1 comment
- GitHub - myshell-ai/JetMoE: Reaching LLaMA2 Performance with 0.1M Dollars https://github.com/myshell-ai/JetMoE 0 comments
Linked pages
- Build AI The Simple Way | Lepton AI https://www.lepton.ai/ 26 comments
- [2306.04640] ModuleFormer: Modularity Emerges from Mixture-of-Experts https://arxiv.org/abs/2306.04640 1 comment
- GitHub - myshell-ai/JetMoE: Reaching LLaMA2 Performance with 0.1M Dollars https://github.com/myshell-ai/JetMoE 0 comments