Hacker News
- An overview of gradient descent optimization algorithms http://ruder.io/optimizing-gradient-descent/index.html 3 comments
Reddit
- [D] Importance of square root in denominator for AdaGrad https://www.ruder.io/optimizing-gradient-descent/ 2 comments machinelearning
- Stochastic gradient descent: from noisy gradients in millions of dimensions for neural network training - how to go to 2nd order methods? http://ruder.io/optimizing-gradient-descent/index.html 26 comments math
Linking pages
- Making Your Neural Network Say “I Don’t Know” — Bayesian NNs using Pyro and PyTorch | by Paras Chopra | Towards Data Science https://towardsdatascience.com/making-your-neural-network-say-i-dont-know-bayesian-nns-using-pyro-and-pytorch-b1c24e6ab8cd 46 comments
- ML Resources https://sgfin.github.io/learning-resources/ 21 comments
- GitHub - amitness/learning: A log of things I'm learning https://github.com/amitness/learning 17 comments
- 10 Stochastic Gradient Descent Optimisation Algorithms + Cheatsheet | by Raimi Karim | Towards Data Science https://towardsdatascience.com/10-gradient-descent-optimisation-algorithms-86989510b5e9 15 comments
- Deep Learning Algorithms - The Complete Guide | AI Summer https://theaisummer.com/Deep-Learning-Algorithms/ 12 comments
- Checklist for debugging neural networks | by Cecelia Shao | Towards Data Science https://towardsdatascience.com/checklist-for-debugging-neural-networks-d8b2a9434f21 2 comments
- Kevin Zakka's Blog https://kevinzakka.github.io/2020/02/10/nca/ 1 comment
- Gradient Descent With Adam in Plain C++ « The blog at the bottom of the sea https://blog.demofox.org/2024/02/11/gradient-descent-with-adam-in-plain-c/ 1 comment
- GitHub - amrzv/awesome-colab-notebooks: Collection of google colaboratory notebooks for fast and easy experiments https://github.com/amrzv/awesome-colab-notebooks 0 comments
- 37 Reasons why your Neural Network is not working | by Slav Ivanov | Slav https://blog.slavv.com/37-reasons-why-your-neural-network-is-not-working-4020854bd607 0 comments
- How to build a Neural Net in three lines of math https://medium.freecodecamp.org/how-to-build-a-neural-net-in-three-lines-of-math-a0c42f45c40e 0 comments
- bnomial-archive/questions.md at master · akhildevelops/bnomial-archive · GitHub https://github.com/Enforcer007/bnomial-archive/blob/master/questions.md 0 comments
- Core Modeling at Instagram. At Instagram we have many Machine… | by Thomas Bredillet | Instagram Engineering https://instagram-engineering.com/core-modeling-at-instagram-a51e0158aa48 0 comments
- GANs in computer vision - Improved training with Wasserstein distance, game theory control and progressively growing schemes | AI Summer https://theaisummer.com/gan-computer-vision-incremental-training/ 0 comments
- A Gentle Intro to Transfer Learning | by Slav Ivanov | Slav https://blog.slavv.com/a-gentle-intro-to-transfer-learning-2c0b674375a0 0 comments
- Understanding objective functions in neural networks. | by Lars Hulstaert | Towards Data Science https://towardsdatascience.com/understanding-objective-functions-in-neural-networks-d217cb068138 0 comments
- Gradient descent vs. neuroevolution | by Lars Hulstaert | Towards Data Science https://towardsdatascience.com/gradient-descent-vs-neuroevolution-f907dace010f?gi=5fe59397195b 0 comments
- Fathoming the Deep in Deep Learning – A Practical Approach – thoughtful impressions https://avantlive.wordpress.com/2019/04/29/fathoming-the-deep-in-deep-learning-a-practical-approach/ 0 comments
- The Ultimate Guide to Linear Regression · Learning With Data https://learningwithdata.com/posts/tylerfolkman/the-ultimate-guide-to-linear-regression/ 0 comments
- Fathoming ‘the Deep’ in Deep Learning — A Practical Approach | by Nachi Nachiappan | Towards Data Science https://towardsdatascience.com/fathoming-the-deep-in-deep-learning-a-practical-approach-c18adab31cbe 0 comments