Linked pages
- Attention is All you Need https://papers.nips.cc/paper/7181-attention-is-all-you-need 30 comments
- The Illustrated Transformer – Jay Alammar – Visualizing machine learning one concept at a time. https://jalammar.github.io/illustrated-transformer/ 25 comments
- GitHub - google-research/bert: TensorFlow code and pre-trained models for BERT https://github.com/google-research/bert 21 comments
- The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning) – Jay Alammar – Visualizing machine learning one concept at a time. https://jalammar.github.io/illustrated-bert/ 20 comments
- [1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding https://arxiv.org/abs/1810.04805 18 comments
- Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention) – Jay Alammar – Visualizing machine learning one concept at a time. https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/ 1 comment
- Open Sourcing BERT: State-of-the-Art Pre-training for Natural Language Processing – Google AI Blog https://ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html 0 comments
Source article: BERT Research - Ep. 1 - Key Concepts & Sources · Chris McCormick (mccormickml.com)