Linking pages
- Amazon SageMaker's fifth birthday: Looking back, looking forward - Amazon Science https://www.amazon.science/blog/amazon-sagemakers-fifth-birthday-looking-back-looking-forward
- Training large language models on Amazon SageMaker: Best practices | AWS Machine Learning Blog https://aws.amazon.com/blogs/machine-learning/training-large-language-models-on-amazon-sagemaker-best-practices/
Linked pages
- Making DeepSpeed ZeRO run efficiently on more-affordable hardware - Amazon Science https://www.amazon.science/blog/making-deepspeed-zero-run-efficiently-on-more-affordable-hardware
- Near-linear scaling of gigantic-model training on AWS - Amazon Science https://www.amazon.science/blog/near-linear-scaling-of-gigantic-model-training-on-aws
- [1604.06174] Training Deep Nets with Sublinear Memory Cost https://arxiv.org/abs/1604.06174
Source page: Scaling to trillion-parameter model training on AWS - Amazon Science