Hacker News
- Practical Tips for Finetuning LLMs Using LoRA (Low-Rank Adaptation) https://magazine.sebastianraschka.com/p/practical-tips-for-finetuning-llms 27 comments
- [P] Practical Tips for Finetuning LLMs Using LoRA (Low-Rank Adaptation): Things I Learned From Hundreds of Experiments https://magazine.sebastianraschka.com/p/practical-tips-for-finetuning-llms 10 comments machinelearning
Linking pages
- 10 Noteworthy AI Research Papers of 2023 https://magazine.sebastianraschka.com/p/10-ai-research-papers-2023 24 comments
- Improving LoRA: Implementing Weight-Decomposed Low-Rank Adaptation (DoRA) from Scratch https://magazine.sebastianraschka.com/p/lora-and-dora-from-scratch 10 comments
- How to Fine-Tune LLMs in 2024 with Hugging Face https://www.philschmid.de/fine-tune-llms-in-2024-with-trl 0 comments
Linked pages
- [2305.14314] QLoRA: Efficient Finetuning of Quantized LLMs https://arxiv.org/abs/2305.14314 129 comments
- [2305.11206] LIMA: Less Is More for Alignment https://arxiv.org/abs/2305.11206 44 comments
- Finetuning LLMs with LoRA and QLoRA: Insights from Hundreds of Experiments - Lightning AI https://lightning.ai/pages/community/lora-insights/ 39 comments
- LLM Training: RLHF and Its Alternatives https://magazine.sebastianraschka.com/p/llm-training-rlhf-and-its-alternatives 14 comments
- [2106.09685] LoRA: Low-Rank Adaptation of Large Language Models https://arxiv.org/abs/2106.09685 8 comments
- [2305.14342] Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training https://arxiv.org/abs/2305.14342 8 comments
- Ahead of AI #9: LLM Tuning & Dataset Perspectives https://magazine.sebastianraschka.com/p/ahead-of-ai-9-llm-tuning-and-dataset 4 comments
- GitHub - tatsu-lab/stanford_alpaca https://github.com/tatsu-lab/stanford_alpaca 2 comments