Linking pages
- 2019 — Year of BERT and Transformer | by Manu Suryavansh | Towards Data Science https://towardsdatascience.com/2019-year-of-bert-and-transformer-f200b53d05b9?sk=77913662dd96ce5de77998341504902f&source=friends_link
- NLP Year in Review — 2019. NLP highlights for the year 2019. | by elvis | DAIR.AI | Medium https://medium.com/dair-ai/nlp-year-in-review-2019-fb8d523bcb19
Linked pages
- Hugging Face – The AI community building the future. https://huggingface.co/
- spaCy · Industrial-strength Natural Language Processing in Python https://spacy.io/
- GitHub - huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. https://github.com/huggingface/transformers
- [1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding https://arxiv.org/abs/1810.04805
- GitHub - google-research/bert: TensorFlow code and pre-trained models for BERT https://github.com/google-research/bert
- Prodigy · An annotation tool for AI, Machine Learning & NLP https://prodi.gy
- [1904.12848] Unsupervised Data Augmentation for Consistency Training https://arxiv.org/abs/1904.12848
- [1711.03953] Breaking the Softmax Bottleneck: A High-Rank RNN Language Model https://arxiv.org/abs/1711.03953
- GitHub - explosion/thinc: 🔮 A refreshing functional take on deep learning, compatible with your favorite libraries https://github.com/explosion/thinc
- [1906.04341] What Does BERT Look At? An Analysis of BERT's Attention https://arxiv.org/abs/1906.04341
- [1906.02243] Energy and Policy Considerations for Deep Learning in NLP https://arxiv.org/abs/1906.02243
- [1903.05987] To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks https://arxiv.org/abs/1903.05987
- GitHub - google/sentencepiece: Unsupervised text tokenizer for Neural Network-based text generation. https://github.com/google/sentencepiece