Linking pages
- Hardware for Deep Learning. Part 4: ASIC | by Grigory Sapunov | Intento https://blog.inten.to/hardware-for-deep-learning-part-4-asic-96a542fe6a81 8 comments
- The Best Tools for Reinforcement Learning in Python You Actually Want to Try - neptune.ai https://neptune.ai/blog/the-best-tools-for-reinforcement-learning-in-python 0 comments
Linked pages
- GitHub - kingoflolz/mesh-transformer-jax: Model parallel transformers in JAX and Haiku https://github.com/kingoflolz/mesh-transformer-jax 146 comments
- JAX Quickstart — JAX documentation https://jax.readthedocs.io/en/latest/notebooks/quickstart.html 143 comments
- GPT-J-6B: 6B JAX-Based Transformer – Aran Komatsuzaki https://arankomatsuzaki.wordpress.com/2021/06/04/gpt-j/ 79 comments
- GitHub - google/trax: Trax — Deep Learning with Clear Code and Speed https://github.com/google/trax 50 comments
- Using JAX to accelerate our research https://deepmind.com/blog/article/using-jax-to-accelerate-our-research 14 comments
- [1910.00177] Advantage-Weighted Regression: Simple and Scalable Off-Policy Reinforcement Learning https://arxiv.org/abs/1910.00177 5 comments
- GitHub - google/objax https://github.com/google/objax 5 comments
- Google wins MLPerf benchmark contest with fastest ML training supercomputer | Google Cloud Blog https://cloud.google.com/blog/products/ai-machine-learning/google-breaks-ai-performance-records-in-mlperf-with-worlds-fastest-training-supercomputer 1 comment
- [1502.05767] Automatic differentiation in machine learning: a survey https://arxiv.org/abs/1502.05767 1 comment
- GitHub - dionhaefner/pyhpc-benchmarks: A suite of benchmarks for CPU and GPU performance of the most popular high-performance libraries for Python https://github.com/dionhaefner/pyhpc-benchmarks 1 comment
- GitHub - poets-ai/elegy: A High Level API for Deep Learning in JAX https://github.com/poets-ai/elegy 1 comment
- GitHub - jax-md/jax-md: Differentiable, Hardware Accelerated, Molecular Dynamics https://github.com/google/jax-md 1 comment
- trax/trax/models/reformer at master · google/trax · GitHub https://github.com/google/trax/tree/master/trax/models/reformer 1 comment
- EleutherAI https://www.eleuther.ai/ 0 comments
- GitHub - deepmind/rlax https://github.com/deepmind/rlax 0 comments
- From PyTorch to JAX: towards neural net frameworks that purify stateful code — Sabrina J. Mielke https://sjmielke.com/jax-purify.htm 0 comments
- GitHub - HIPS/autograd: Efficiently computes derivatives of numpy code. https://github.com/HIPS/autograd 0 comments
- GitHub - google/flax: Flax is a neural network library for JAX that is designed for flexibility. https://github.com/google/flax/ 0 comments
- GitHub - tensorflow/tensor2tensor: Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. https://github.com/tensorflow/tensor2tensor 0 comments
- The Autodiff Cookbook — JAX documentation https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html 0 comments
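Several of the linked resources (the JAX Quickstart, the Autodiff Cookbook, and HIPS/autograd) center on the same core idea: composable function transformations for automatic differentiation and compilation. A minimal sketch of that idea, using only the standard `jax.grad` and `jax.jit` APIs with an illustrative squared-error loss:

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared-error loss for a linear model (illustrative only).
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad returns a new function computing d(loss)/dw;
# jax.jit compiles that function with XLA for speed.
grad_fn = jax.jit(jax.grad(loss))

w = jnp.array([1.0, -2.0])
x = jnp.array([[1.0, 0.0], [0.0, 1.0]])  # identity inputs, for clarity
y = jnp.array([0.5, 0.5])

g = grad_fn(w, x, y)
print(g)  # gradient of the loss with respect to w
```

With the identity inputs above, the gradient works out to `(2/n) * x.T @ (x @ w - y)`, i.e. `[0.5, -2.5]`. The same pattern (differentiate, then jit) is what the higher-level libraries in the list (Flax, Objax, Elegy, RLax, Trax) build on.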
Source article: JAX Ecosystem. JAX by Google Research is getting more… | by Grigory Sapunov | Medium