Linking pages
- State-space LLMs: Do we need Attention? https://www.interconnects.ai/p/llms-beyond-attention
- GitHub - HazyResearch/aisys-building-blocks: Building blocks for foundation models. https://github.com/HazyResearch/aisys-building-blocks
- The Four Wars of the AI Stack (Dec 2023 Recap) https://www.latent.space/p/dec-2023
- GitHub - sustcsonglin/flash-linear-attention: Efficient implementations of state-of-the-art linear attention models in Pytorch and Triton https://github.com/sustcsonglin/flash-linear-attention