Linked pages
- [1609.08144] Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation http://arxiv.org/abs/1609.08144 97 comments
- NLP's ImageNet moment has arrived https://thegradient.pub/nlp-imagenet/ 42 comments
- [1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding https://arxiv.org/abs/1810.04805 25 comments
- GitHub - google-research/bert: TensorFlow code and pre-trained models for BERT https://github.com/google-research/bert 21 comments
- Neural Coreference · Hugging Face https://huggingface.co/coref/ 5 comments
- [1711.10160] Snorkel: Rapid Training Data Creation with Weak Supervision https://arxiv.org/abs/1711.10160 1 comment
- [1801.06146] Universal Language Model Fine-tuning for Text Classification https://arxiv.org/abs/1801.06146 0 comments
- Building NLP Classifiers Cheaply With Transfer Learning and Weak Supervision | by Abraham Starosta | Sculpt | Medium https://towardsdatascience.com/a-technique-for-building-nlp-classifiers-efficiently-with-transfer-learning-and-weak-supervision-a8e2f21ca9c8 0 comments
- GitHub - huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. https://github.com/huggingface/pytorch-pretrained-BERT 0 comments
- GLUE Benchmark https://gluebenchmark.com/leaderboard/ 0 comments
- An Overview of Multi-Task Learning for Deep Learning http://ruder.io/multi-task/ 0 comments
- https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf 0 comments
Related searches:
Search whole site: site:dawn.cs.stanford.edu
Search title: Massive Multi-Task Learning with Snorkel MeTaL: Bringing More Supervision to Bear · Stanford DAWN