Hacker News
- Attention and Augmented Recurrent Neural Networks https://distill.pub/2016/augmented-rnns/ 4 comments
- Attention and Augmented Recurrent Neural Networks http://distill.pub/2016/augmented-rnns/ 5 comments
Linking pages
- Why Tool AIs Want to Be Agent AIs · Gwern.net http://www.gwern.net/Tool%20AI 58 comments
- Transformers are Graph Neural Networks https://thegradient.pub/transformers-are-graph-neural-networks/ 25 comments
- ML Resources https://sgfin.github.io/learning-resources/ 21 comments
- Transformers are Graph Neural Networks | NTU Graph Deep Learning Lab https://graphdeeplearning.github.io/post/transformers-are-gnns/ 19 comments
- GitHub - andrewt3000/DL4NLP: Deep Learning for NLP resources https://github.com/andrewt3000/DL4NLP 14 comments
- Taming Recurrent Neural Networks for Better Summarization | Abigail See http://www.abigailsee.com/2017/04/16/taming-rnns-for-better-summarization.html 8 comments
- The Google Brain Team — Looking Back on 2017 (Part 1 of 2) – Google AI Blog https://research.googleblog.com/2018/01/the-google-brain-team-looking-back-on.html 6 comments
- Memory, attention, sequences. We have seen the rise and success of… | by Eugenio Culurciello | Towards Data Science https://towardsdatascience.com/memory-attention-sequences-37456d271992 3 comments
- GitHub - guillaume-chevalier/Awesome-Deep-Learning-Resources: Rough list of my favorite deep learning resources, useful for revisiting topics or for reference. I have got through all of the content listed there, carefully. - Guillaume Chevalier https://github.com/guillaume-chevalier/awesome-deep-learning-resources 1 comment
- Why Tool AIs Want to Be Agent AIs · Gwern.net https://www.gwern.net/Tool-AI 1 comment
- Attn: Illustrated Attention. Attention illustrated in GIFs and how… | by Raimi Karim | Towards Data Science https://towardsdatascience.com/attn-illustrated-attention-5ec4ad276ee3 1 comment
- GitHub - guillaume-chevalier/PyTorch-Dynamic-RNN-Attention-Decoder-Tree: This is code I wrote within less than an hour so as to very roughly draft how I would code a Dynamic RNN Attention Decoder Tree with PyTorch. https://github.com/guillaume-chevalier/PyTorch-Dynamic-RNN-Attention-Decoder-Tree 1 comment
- How to Visualize Your Recurrent Neural Network with Attention in Keras | by Zafarali Ahmed | Datalogue | Medium https://medium.com/datalogue/attention-in-keras-1892773a4f22 1 comment
- An Introduction to Deep Learning for Generative Models http://www.ymer.org/amir/2016/11/21/an-introduction-to-deep-learning-for-generative-models/ 0 comments
- NIPS 2017 — Day 1 Highlights. — Emmanuel Ameisen, Ben Regner, Jeremy… | by Emmanuel Ameisen | Insight https://blog.insightdatascience.com/nips-2017-day-1-highlights-6aa124c5a2c7 0 comments
- Fine-Tuning BERT for multiclass categorisation with Amazon SageMaker – Grinding Gears https://engineering.freeagent.com/2021/09/15/fine-tuning-bert-for-multiclass-categorisation-with-amazon-sagemaker/ 0 comments
- Attention in Neural Networks and How to Use It http://akosiorek.github.io/ml/2017/10/14/visual-attention.html 0 comments
- Introducing tf-seq2seq: An Open Source Sequence-to-Sequence Framework in TensorFlow – Google AI Blog https://research.googleblog.com/2017/04/introducing-tf-seq2seq-open-source.html 0 comments
- Aman's AI Journal • Primers • Transformers https://aman.ai/primers/ai/transformers/ 0 comments
- Over 150 of the Best Machine Learning, NLP, and Python Tutorials I’ve Found | by Robbie Allen | Machine Learning in Practice | Medium https://medium.com/machine-learning-in-practice/over-150-of-the-best-machine-learning-nlp-and-python-tutorials-ive-found-ffce2939bd78 0 comments