Linking pages
- MIT Open-Sources a Toolkit for Editing Classifiers by Directly Rewriting Their Prediction Rules | Synced https://syncedreview.com/2021/12/10/deepmind-podracer-tpu-based-rl-frameworks-deliver-exceptional-performance-at-low-cost-163/ (1 comment)
- Google Proposes a ‘Simple Trick’ for Dramatically Reducing Transformers’ (Self-)Attention Memory Requirements | Synced https://syncedreview.com/2021/12/14/deepmind-podracer-tpu-based-rl-frameworks-deliver-exceptional-performance-at-low-cost-165/ (0 comments)
Linked pages
- [2112.04426] Improving language models by retrieving from trillions of tokens https://arxiv.org/abs/2112.04426 (9 comments)
- MIT Open-Sources a Toolkit for Editing Classifiers by Directly Rewriting Their Prediction Rules | Synced https://syncedreview.com/2021/12/10/deepmind-podracer-tpu-based-rl-frameworks-deliver-exceptional-performance-at-low-cost-163/ (1 comment)
- Research | Synced https://syncedreview.com/category/technology/ (0 comments)
- Google Proposes a ‘Simple Trick’ for Dramatically Reducing Transformers’ (Self-)Attention Memory Requirements | Synced https://syncedreview.com/2021/12/14/deepmind-podracer-tpu-based-rl-frameworks-deliver-exceptional-performance-at-low-cost-165/ (0 comments)