Linking pages
- Accelerating Neural Networks on Mobile and Web with Sparse Inference – Google AI Blog https://ai.googleblog.com/2021/03/accelerating-neural-networks-on-mobile.html 1 comment
- Google Research: Looking Back at 2020, and Forward to 2021 – Google AI Blog https://ai.googleblog.com/2021/01/google-research-looking-back-at-2020.html 0 comments
Linked pages
- TensorFlow http://tensorflow.org/ 440 comments
- [1803.03635] The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks https://arxiv.org/abs/1803.03635 32 comments
- [1506.02626] Learning both Weights and Connections for Efficient Neural Networks https://arxiv.org/abs/1506.02626 6 comments
- CIFAR-10 and CIFAR-100 datasets https://www.cs.toronto.edu/~kriz/cifar.html 6 comments
- [1902.09574] The State of Sparsity in Deep Neural Networks https://arxiv.org/abs/1902.09574 3 comments
- [1911.09723] Fast Sparse ConvNets https://arxiv.org/abs/1911.09723 1 comment
- [1502.01852] Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification http://arxiv.org/abs/1502.01852 0 comments
- [2006.10901] Sparse GPU Kernels for Deep Learning https://arxiv.org/abs/2006.10901 0 comments
- [1907.04840] Sparse Networks from Scratch: Faster Training without Losing Performance https://arxiv.org/abs/1907.04840 0 comments
- Scalable training of artificial neural networks with adaptive sparse connectivity inspired by network science | Nature Communications https://www.nature.com/articles/s41467-018-04316-3 0 comments