Linking pages
- JAX vs Julia (vs PyTorch) · Patrick Kidger https://kidger.site/thoughts/jax-vs-julia/ 110 comments
- GitHub - google/jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more https://github.com/google/jax 99 comments
- Why another binding library? - nanobind documentation https://nanobind.readthedocs.io/en/latest/why.html 17 comments
- GitHub - adap/flower: Flower: A Friendly Federated Learning Framework https://github.com/adap/flower 13 comments
- skrl (1.0.0-rc.1) https://skrl.readthedocs.io/en/latest/ 8 comments
- An astronomer's introduction to NumPyro | Dan Foreman-Mackey https://dfm.io/posts/intro-to-numpyro/ 3 comments
- GitHub - unifyai/ivy: Unified AI https://github.com/unifyai/ivy 2 comments
- From Imperative to Declarative: Developing Efficient Algorithms for a Post-Moore Era https://eatingentropy.substack.com/p/from-imperative-to-declarative-developing 1 comment
- How to use TensorBoard in JAX & Flax https://www.machinelearningnuggets.com/how-to-use-tensorboard-in-flax/ 0 comments
- GitHub - google/jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more https://opensource.google/projects/jax 0 comments
- GitHub - lululxvi/deepxde: A library for scientific machine learning and physics-informed learning https://github.com/lululxvi/deepxde 0 comments
- GitHub - apple/axlearn https://github.com/apple/axlearn 0 comments
- GitHub - PennyLaneAI/catalyst: A JIT compiler for hybrid quantum programs in PennyLane https://github.com/PennyLaneAI/catalyst 0 comments
- Machine Learning Crash Course for Physicists in Three Easy Chapters https://florianmarquardt.github.io/MachineLearningThreeEasyLessons/intro.html 0 comments
Linked pages
- GitHub - google/jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more https://github.com/google/jax 99 comments
- GitHub - HIPS/autograd: Efficiently computes derivatives of numpy code. https://github.com/HIPS/autograd 0 comments
- XLA: Optimizing Compiler for Machine Learning | TensorFlow https://www.tensorflow.org/xla 0 comments
Related searches:
- Search whole site: site:jax.readthedocs.io
- Search title: JAX: High-Performance Array Computing — JAX documentation