Hacker News
- Jax – Composable transformations of Python and NumPy programs https://github.com/google/jax 65 comments
- JAX: Numpy with Gradients, GPUs and TPUs https://github.com/google/jax 29 comments
- Could someone explain what google's JAX is/does? https://github.com/google/jax 5 comments (r/learnmachinelearning)
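For orientation, here is a minimal sketch of the three transformations these threads discuss (differentiation, vectorization, and JIT compilation), using the public jax.grad, jax.jit, and jax.vmap API; the toy loss function is invented for illustration:

import jax
import jax.numpy as jnp

def loss(w, x):
    # Toy scalar objective, invented for this sketch.
    return jnp.mean(jnp.tanh(jnp.dot(x, w)) ** 2)

grad_loss = jax.grad(loss)                         # differentiate w.r.t. w
fast_grad = jax.jit(grad_loss)                     # compile via XLA for CPU/GPU/TPU
batched_loss = jax.vmap(loss, in_axes=(None, 0))   # vectorize over a batch of x

w = jnp.ones(3)
xs = jnp.arange(12.0).reshape(4, 3)
print(fast_grad(w, xs[0]))    # gradient at a single example
print(batched_loss(w, xs))    # per-example losses for the whole batch

The transformations also compose: jax.jit(jax.vmap(jax.grad(loss), in_axes=(None, 0))) gives compiled per-example gradients.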
Linking pages
- GitHub - deepmind/alphafold: Open source code for AlphaFold. https://github.com/deepmind/alphafold 315 comments
- PyTorch 2.0: Our next generation release that is faster, more Pythonic and Dynamic as ever | PyTorch https://pytorch.org/blog/pytorch-2.0-release/ 122 comments
- JAX vs Julia (vs PyTorch) · Patrick Kidger https://kidger.site/thoughts/jax-vs-julia/ 110 comments
- What I learned from looking at 200 machine learning tools https://huyenchip.com/2020/06/22/mlops.html 106 comments
- #034 José Valim reveals Project Nx - Thinking Elixir https://thinkingelixir.com/podcast-episodes/034-jose-valim-reveals-project-nx/ 105 comments
- 2022 State of Competitive ML — The Downfall of TensorFlow | by Mark Kurtz | Mar, 2023 | Medium https://medium.com/@markurtz/2022-state-of-competitive-ml-the-downfall-of-tensorflow-e2577c499a4d 68 comments
- GitHub - josephmisiti/awesome-machine-learning: A curated list of awesome Machine Learning frameworks, libraries and software. https://github.com/josephmisiti/awesome-machine-learning 58 comments
- Google Research: Themes from 2021 and Beyond – Google AI Blog https://ai.googleblog.com/2022/01/google-research-themes-from-2021-and.html 52 comments
- Google launches PaLM 2, its next-gen large language model | TechCrunch https://techcrunch.com/2023/05/10/google-launches-palm-2-its-next-gen-large-language-model/ 52 comments
- GitHub - google/trax: Trax — Deep Learning with Clear Code and Speed https://github.com/google/trax 50 comments
- Lagrangian Neural Networks https://greydanus.github.io/2020/03/10/lagrangian-nns/ 49 comments
- Supercharged high-resolution ocean simulation with JAX | dionhaefner.github.io https://dionhaefner.github.io/2021/12/supercharged-high-resolution-ocean-simulation-with-jax/ 48 comments
- nx/nx at main · elixir-nx/nx · GitHub https://github.com/elixir-nx/nx/tree/main/nx#readme 46 comments
- Reformer: The Efficient Transformer – Google AI Blog https://ai.googleblog.com/2020/01/reformer-efficient-transformer.html 44 comments
- Beyond automatic differentiation – Google AI Blog https://ai.googleblog.com/2023/04/beyond-automatic-differentiation.html 34 comments
- Nx (Numerical Elixir) is now publicly available - Dashbit Blog https://dashbit.co/blog/nx-numerical-elixir-is-now-publicly-available 31 comments
- GitHub - breandan/kotlingrad: 🧩 Shape-Safe Symbolic Differentiation with Algebraic Data Types https://github.com/breandan/kotlingrad 27 comments
- GitHub - huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. https://github.com/huggingface/transformers 26 comments
- GitHub - cleverhans-lab/cleverhans: An adversarial example library for constructing attacks, building defenses, and benchmarking both https://github.com/cleverhans-lab/cleverhans 23 comments
- Google Research: Looking Back at 2019, and Forward to 2020 and Beyond – Google AI Blog https://ai.googleblog.com/2020/01/google-research-looking-back-at-2019.html 18 comments
Linked pages
- CUDA 12.1 Release Notes https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html 610 comments
- Using JAX to accelerate our research https://deepmind.com/blog/article/using-jax-to-accelerate-our-research 14 comments
- CUDA Toolkit 12.1 Downloads | NVIDIA Developer https://developer.nvidia.com/cuda-downloads 5 comments
- Pure function - Wikipedia https://en.wikipedia.org/wiki/Pure_function 2 comments
- What is Windows Subsystem for Linux | Microsoft Learn https://docs.microsoft.com/en-us/windows/wsl/about 2 comments
- Referential transparency - Wikipedia https://en.wikipedia.org/wiki/Referential_transparency 0 comments
- GitHub - deepmind/rlax https://github.com/deepmind/rlax 0 comments
- GitHub - HIPS/autograd: Efficiently computes derivatives of numpy code. https://github.com/HIPS/autograd 0 comments
- GitHub - google/flax: Flax is a neural network library for JAX that is designed for flexibility. https://github.com/google/flax/ 0 comments
- CUDA Deep Neural Network (cuDNN) | NVIDIA Developer https://developer.nvidia.com/cuDNN 0 comments
- The Autodiff Cookbook — JAX documentation https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html 0 comments
- XLA: Optimizing Compiler for Machine Learning | TensorFlow https://www.tensorflow.org/xla 0 comments
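The pure-function and referential-transparency links above are relevant because jax.jit assumes the function it compiles is pure: JAX traces the Python body once per input shape/dtype and compiles the traced computation with XLA, so side effects run at trace time rather than on every call. A small illustration (values invented):

import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    print("tracing f")      # impure side effect: runs only while tracing
    return jnp.sin(x) + 1.0

x = jnp.arange(3.0)
f(x)       # prints "tracing f" once, then compiles and runs
f(x + 1)   # same shape/dtype: reuses the compiled code, no print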