- [Discussion] Is RETRO inference faster than GPT-3? https://deepmind.com/research/publications/2021/improving-language-models-by-retrieving-from-trillions-of-tokens 5 comments machinelearning
Linking pages
- The Illustrated Retrieval Transformer – Jay Alammar – Visualizing machine learning one concept at a time. http://jalammar.github.io/illustrated-retrieval-transformer/ 55 comments
- GitHub - lucidrains/x-transformers: A simple but complete full-attention transformer with a set of promising experimental features from various papers https://github.com/lucidrains/x-transformers 40 comments
- GPT-3, Foundation Models, and AI Nationalism https://lastweekin.ai/p/gpt-3-foundation-models-and-ai-nationalism 1 comment
- Retrieval Transformers for Medicine - by Arsham G https://arsham.substack.com/p/retrieval-transformers-for-medicine 0 comments
- The State of Machine Learning in 8 Papers — February, 2022 | by Sergi Castella i Sapé | Heartbeat https://heartbeat.comet.ml/the-state-of-machine-learning-in-8-papers-february-2022-4cf0293f1b6?gi=3f78497257a1 0 comments
- GitHub - tomohideshibata/BERT-related-papers: BERT-related papers https://github.com/tomohideshibata/BERT-related-papers 0 comments