Linking pages
- Google Research: Looking Back at 2019, and Forward to 2020 and Beyond – Google AI Blog https://ai.googleblog.com/2020/01/google-research-looking-back-at-2019.html 18 comments
- Building Production Machine Learning Systems | by Manu Suryavansh | Heartbeat https://heartbeat.fritz.ai/building-production-machine-learning-systems-7eda2fda0cdf 0 comments
- Speeding Up Neural Network Training with Data Echoing – Google AI Blog https://ai.googleblog.com/2020/05/speeding-up-neural-network-training.html 0 comments
Linked pages
- Common Crawl https://commoncrawl.org/ 85 comments
- Understanding LSTM Networks -- colah's blog https://colah.github.io/posts/2015-08-Understanding-LSTMs/ 64 comments
- ImageNet http://image-net.org/index 12 comments
- Introducing GPipe, an Open Source Library for Efficiently Training Large-scale Neural Network Models – Google AI Blog http://ai.googleblog.com/2019/03/introducing-gpipe-open-source-library.html 12 comments
- CIFAR-10 and CIFAR-100 datasets https://www.cs.toronto.edu/~kriz/cifar.html 6 comments
- [1512.03385] Deep Residual Learning for Image Recognition http://arxiv.org/abs/1512.03385 6 comments
- Transformer: A Novel Neural Network Architecture for Language Understanding – Google AI Blog https://ai.googleblog.com/2017/08/transformer-novel-neural-network.html 3 comments
- Data parallelism - Wikipedia https://en.wikipedia.org/wiki/Data_parallelism 0 comments
- Train and run machine learning models faster | Cloud TPU | Google Cloud https://cloud.google.com/tpu/ 0 comments
- [1811.03600] Measuring the Effects of Data Parallelism on Neural Network Training https://arxiv.org/abs/1811.03600 0 comments
Article: Measuring the Limits of Data Parallel Training for Neural Networks – Google AI Blog