Hacker News
- Jax – Composable transformations of Python and NumPy programs https://github.com/google/jax 65 comments
- JAX: Numpy with Gradients, GPUs and TPUs https://github.com/google/jax 29 comments
- Could someone explain what Google's JAX is/does? https://github.com/google/jax 5 comments (r/learnmachinelearning)
Linking pages
- Gemma: Google introduces new state-of-the-art open models https://blog.google/technology/developers/gemma-open-models/ 535 comments
- GitHub - deepmind/alphafold: Open source code for AlphaFold. https://github.com/deepmind/alphafold 315 comments
- GitHub - bentoml/OpenLLM: Operating LLMs in production https://github.com/bentoml/OpenLLM 172 comments
- GitHub - google-deepmind/searchless_chess: Grandmaster-Level Chess Without Search https://github.com/google-deepmind/searchless_chess 168 comments
- Advancements in machine learning for machine learning – Google Research Blog https://blog.research.google/2023/12/advancements-in-machine-learning-for.html 151 comments
- PyTorch 2.0: Our next generation release that is faster, more Pythonic and Dynamic as ever | PyTorch https://pytorch.org/blog/pytorch-2.0-release/ 122 comments
- GitHub - marimo-team/marimo: A reactive notebook for Python — run reproducible experiments, execute as a script, deploy as an app, and version with git. https://github.com/marimo-team/marimo 116 comments
- JAX vs Julia (vs PyTorch) · Patrick Kidger https://kidger.site/thoughts/jax-vs-julia/ 110 comments
- What I learned from looking at 200 machine learning tools https://huyenchip.com/2020/06/22/mlops.html 106 comments
- #034 José Valim reveals Project Nx - Thinking Elixir https://thinkingelixir.com/podcast-episodes/034-jose-valim-reveals-project-nx/ 105 comments
- GitHub - ml-explore/mlx: MLX: An array framework for Apple silicon https://github.com/ml-explore/mlx 90 comments
- 2022 State of Competitive ML — The Downfall of TensorFlow | by Mark Kurtz | Mar, 2023 | Medium https://medium.com/@markurtz/2022-state-of-competitive-ml-the-downfall-of-tensorflow-e2577c499a4d 68 comments
- GitHub - josephmisiti/awesome-machine-learning: A curated list of awesome Machine Learning frameworks, libraries and software. https://github.com/josephmisiti/awesome-machine-learning 58 comments
- Google Research: Themes from 2021 and Beyond – Google AI Blog https://ai.googleblog.com/2022/01/google-research-themes-from-2021-and.html 52 comments
- Google launches PaLM 2, its next-gen large language model | TechCrunch https://techcrunch.com/2023/05/10/google-launches-palm-2-its-next-gen-large-language-model/ 52 comments
- GitHub - google/trax: Trax — Deep Learning with Clear Code and Speed https://github.com/google/trax 50 comments
- Lagrangian Neural Networks https://greydanus.github.io/2020/03/10/lagrangian-nns/ 49 comments
- Supercharged high-resolution ocean simulation with JAX | dionhaefner.github.io https://dionhaefner.github.io/2021/12/supercharged-high-resolution-ocean-simulation-with-jax/ 48 comments
- nx/nx at main · elixir-nx/nx · GitHub https://github.com/elixir-nx/nx/tree/main/nx#readme 46 comments
- A High-Level Technical Overview of Fully Homomorphic Encryption || Math ∩ Programming https://www.jeremykun.com/2024/05/04/fhe-overview/ 45 comments
Linked pages
- CUDA 12.1 Release Notes https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html 610 comments
- JAX Quickstart — JAX documentation https://jax.readthedocs.io/en/latest/notebooks/quickstart.html 143 comments
- Using JAX to accelerate our research https://deepmind.com/blog/article/using-jax-to-accelerate-our-research 14 comments
- CUDA Toolkit 12.1 Downloads | NVIDIA Developer https://developer.nvidia.com/cuda-downloads 5 comments
- Pure function - Wikipedia https://en.wikipedia.org/wiki/Pure_function 2 comments
- What is Windows Subsystem for Linux | Microsoft Learn https://docs.microsoft.com/en-us/windows/wsl/about 2 comments
- JAX: High-Performance Array Computing — JAX documentation https://jax.readthedocs.io/ 1 comment
- Referential transparency - Wikipedia https://en.wikipedia.org/wiki/Referential_transparency 0 comments
- GitHub - deepmind/rlax https://github.com/deepmind/rlax 0 comments
- GitHub - HIPS/autograd: Efficiently computes derivatives of numpy code. https://github.com/HIPS/autograd 0 comments
- GitHub - google/flax: Flax is a neural network library for JAX that is designed for flexibility. https://github.com/google/flax/ 0 comments
- CUDA Deep Neural Network (cuDNN) | NVIDIA Developer https://developer.nvidia.com/cuDNN 0 comments
- The Autodiff Cookbook — JAX documentation https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html 0 comments
- XLA: Optimizing Compiler for Machine Learning | TensorFlow https://www.tensorflow.org/xla 0 comments
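For context on what the google/jax repository itself provides, here is a minimal sketch of its composable transformations (differentiate, vectorize, JIT-compile to CPU/GPU/TPU). The toy loss function and array shapes are illustrative assumptions, not taken from any of the pages above:

```python
import jax
import jax.numpy as jnp

def loss(w, x):
    # An arbitrary quadratic loss: JAX traces ordinary NumPy-style Python code.
    return jnp.sum((w * x - 1.0) ** 2)

grad_loss = jax.grad(loss)                         # differentiate w.r.t. the first argument
fast_grad = jax.jit(grad_loss)                     # JIT-compile via XLA
batched = jax.vmap(fast_grad, in_axes=(None, 0))   # vectorize over a batch of x

w = jnp.array([1.0, 2.0])
xs = jnp.ones((8, 2))
print(batched(w, xs).shape)  # (8, 2): one gradient per batch element
```

The transformations compose in any order, which is the point of the repo's title: `vmap(jit(grad(f)))` is as valid as the chain above.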