Linked pages
- Stability AI Launches the First of its StableLM Suite of Language Models — Stability AI https://stability.ai/blog/stability-ai-launches-the-first-of-its-stablelm-suite-of-language-models 501 comments
- [2205.01068] OPT: Open Pre-trained Transformer Language Models https://arxiv.org/abs/2205.01068 318 comments
- Open Assistant https://open-assistant.io/ 311 comments
- Alpaca: A Strong, Replicable Instruction-Following Model — Stanford CRFM https://crfm.stanford.edu/2023/03/13/alpaca.html 298 comments
- Introducing ChatGPT https://openai.com/blog/chatgpt/ 296 comments
- Claude 2 — Anthropic https://www.anthropic.com/index/claude-2 292 comments
- [2005.14165] Language Models are Few-Shot Learners https://arxiv.org/abs/2005.14165 201 comments
- [2305.13048] RWKV: Reinventing RNNs for the Transformer Era https://arxiv.org/abs/2305.13048 171 comments
- Releasing 3B and 7B RedPajama-INCITE family of models including base, instruction-tuned & chat models — TOGETHER https://www.together.xyz/blog/redpajama-models-v1 154 comments
- GitHub - kingoflolz/mesh-transformer-jax: Model parallel transformers in JAX and Haiku https://github.com/kingoflolz/mesh-transformer-jax 146 comments
- [2305.14314] QLoRA: Efficient Finetuning of Quantized LLMs https://arxiv.org/abs/2305.14314 129 comments
- [2310.06825] Mistral 7B https://arxiv.org/abs/2310.06825 124 comments
- Falcon LLM - Home https://falconllm.tii.ae/ 87 comments
- GPT-J-6B: 6B JAX-Based Transformer – Aran Komatsuzaki https://arankomatsuzaki.wordpress.com/2021/06/04/gpt-j/ 79 comments
- Free Dolly: Introducing the World's First Open and Commercially Viable Instruction-Tuned LLM - The Databricks Blog https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm 54 comments
- [2304.03208] Cerebras-GPT: Open Compute-Optimal Language Models Trained on the Cerebras Wafer-Scale Cluster https://arxiv.org/abs/2304.03208 51 comments
- GitHub - salesforce/ctrl: Conditional Transformer Language Model for Controllable Generation https://github.com/salesforce/ctrl 40 comments
- [2305.10403] PaLM 2 Technical Report https://arxiv.org/abs/2305.10403 36 comments
- [1810.04805] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding https://arxiv.org/abs/1810.04805 25 comments
- WizardLM/WizardLM-13B-V1.2 — Hugging Face https://huggingface.co/WizardLM/WizardLM-13B-V1.2 25 comments
Source: LLM Collection | Prompt Engineering Guide (promptingguide.ai)