Hacker News
Linking pages
- Cerebras-GPT: A Family of Open, Compute-efficient, Large Language Models - Cerebras https://www.cerebras.net/blog/cerebras-gpt-a-family-of-open-compute-efficient-large-language-models/ 234 comments
- Cerebras Unveils Andromeda, a 13.5 Million Core AI Supercomputer that Delivers Near-Perfect Linear Scaling for Large Language Models - Cerebras https://www.cerebras.net/press-release/cerebras-unveils-andromeda-a-13.5-million-core-ai-supercomputer-that-delivers-near-perfect-linear-scaling-for-large-language-models 12 comments
- Cerebras Reveals Andromeda, a 13.5 Million Core AI Supercomputer | Tom's Hardware https://www.tomshardware.com/news/cerebras-reveals-andromeda-a-135-million-core-ai-supercomputer 0 comments
- The 13.5 Million Core Computer | Hackaday https://hackaday.com/2022/11/23/the-13-5-million-core-computer/ 0 comments
- Cerebras Systems Releases Seven New GPT Models Trained on CS-2 Wafer-Scale Systems - Cerebras https://www.cerebras.net/press-release/cerebras-systems-releases-seven-new-gpt-models-trained-on-cs-2-wafer-scale-systems 0 comments
Linked pages
- Cerebras Unveils Andromeda, a 13.5 Million Core AI Supercomputer that Delivers Near-Perfect Linear Scaling for Large Language Models - Cerebras https://www.cerebras.net/press-release/cerebras-unveils-andromeda-a-13.5-million-core-ai-supercomputer-that-delivers-near-perfect-linear-scaling-for-large-language-models 12 comments
- Product - Chip - Cerebras https://www.cerebras.net/product-chip/ 2 comments