Hacker News
- JAX – NumPy on the CPU, GPU, and TPU https://jax.readthedocs.io/en/latest/notebooks/quickstart.html 143 comments
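The headline describes JAX as NumPy that runs on CPU, GPU, and TPU. A minimal sketch of its two headline transformations, `jax.grad` (automatic differentiation) and `jax.jit` (XLA compilation), assuming `jax` is installed; the function `f` here is an illustrative example, not from the linked quickstart:

```python
import jax
import jax.numpy as jnp  # drop-in NumPy-like API

def f(x):
    # Simple scalar-valued function: sum of squares.
    return jnp.sum(x ** 2)

grad_f = jax.grad(f)  # transforms f into its gradient function
fast_f = jax.jit(f)   # compiles f with XLA for CPU/GPU/TPU

x = jnp.arange(3.0)            # [0., 1., 2.]
print(grad_f(x))               # gradient of sum(x^2) is 2x -> [0., 2., 4.]
print(fast_f(x))               # 0 + 1 + 4 -> 5.0
```

The same `f` runs unchanged on whatever accelerator backend JAX finds, which is the "NumPy on the CPU, GPU, and TPU" claim in the title.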
Linking pages
- Differentiable Programming from Scratch – Max Slater – Computer Graphics, Programming, and Math https://thenumb.at/Autodiff/ 107 comments
- GitHub - google/jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more https://github.com/google/jax 99 comments
- How JIT Compilers are Implemented and Fast: Pypy, LuaJIT, Graal and More | kipply's blog https://carolchen.me/blog/jits-impls/ 53 comments
- An astronomer's introduction to NumPyro | Dan Foreman-Mackey https://dfm.io/posts/intro-to-numpyro/ 3 comments
- How to solve the two language problem? https://scientificcoder.com/how-to-solve-the-two-language-problem 3 comments
- How JIT Compilers are Implemented and Fast: Pypy, LuaJIT, Graal and More | kipply's blog https://kipp.ly/jits-impls/ 2 comments
- GitHub - mlegls/hyjax: Hy bindings for JAX https://github.com/mlegls/hyjax 1 comment
- GitHub - pybamm-team/PyBaMM: Fast and flexible physics-based battery models in Python https://github.com/pybamm-team/PyBaMM 1 comment
- The Next Generation of Machine Learning Tools | Roman Ring http://inoryy.com/post/next-gen-ml-tools/ 0 comments
- How to Read the News like a Bayesian — Count Bayesie https://www.countbayesie.com/blog/2022/2/19/how-to-read-the-news-like-a-bayesian 0 comments
- EvoJAX: A Great Framework For Most Deep Tasks | by Reza Yazdanfar | Medium https://medium.com/mlearning-ai/evojax-a-great-framework-for-most-deep-tasks-10adf685c152 0 comments
- GitHub - google/jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more https://opensource.google/projects/jax 0 comments
- How JIT Compilers are Implemented and Fast: Pypy, LuaJIT, Graal and More | kipply's blog https://kipp.ly/blog/jits-impls/ 0 comments
- JAX Ecosystem. JAX by Google Research is getting more… | by Grigory Sapunov | Medium https://moocaholic.medium.com/jax-a13e83f49897 0 comments
- Baselines for Uncertainty and Robustness in Deep Learning – Google AI Blog https://ai.googleblog.com/2021/10/baselines-for-uncertainty-and.html 0 comments
- MCMC in JAX with benchmarks: 3 ways to write a sampler https://www.jeremiecoullon.com/2020/11/10/mcmcjax3ways/ 0 comments
- GitHub - facebookresearch/torchdim: Named tensors with first-class dimensions for PyTorch https://github.com/facebookresearch/torchdim 0 comments
- GitHub - alexOarga/haiku-geometric: A collection of graph neural networks implementations in JAX https://github.com/alexOarga/haiku-geometric 0 comments
Linked pages
- Automatic differentiation - Wikipedia https://en.wikipedia.org/wiki/Automatic_differentiation 86 comments
- GitHub - HIPS/autograd: Efficiently computes derivatives of numpy code. https://github.com/HIPS/autograd 0 comments
- XLA: Optimizing Compiler for Machine Learning | TensorFlow https://www.tensorflow.org/xla 0 comments
Search title: JAX Quickstart — JAX documentation