Hacker News
- RedPajama 7B (an Apache 2.0-licensed LLaMA) is now available https://www.together.xyz/blog/redpajama-7b 25 comments
- Together’s $20M seed funding to build open-source AI and cloud platform https://www.together.xyz/blog/seed-funding 22 comments
- Releasing 3B and 7B RedPajama https://www.together.xyz/blog/redpajama-models-v1 106 comments
- RedPajama at 440B tokens higher quality than Pythia and StableLM https://www.together.xyz/blog/redpajama-training-progress 2 comments
- RedPajama: Reproduction of LLaMA with friendly license https://www.together.xyz/blog/redpajama 216 comments
- OpenChatKit – Fully open-source model, data and more https://www.together.xyz/blog/openchatkit 5 comments
- Releasing v1 of GPT-JT, fork of GPT-J-6B fine-tuned on 3.53B tokens https://www.together.xyz/research/releasing-v1-of-gpt-jt-powered-by-open-source-ai 17 comments
- RedPajama 7B now available, instruct model outperforms all open 7B models on HELM benchmarks https://www.together.xyz/blog/redpajama-7b 16 comments

machinelearningnews
- [P] The first RedPajama models are here! The 3B and 7B models are now available under Apache 2.0, including instruction-tuned and chat versions. These models aim to replicate LLaMA as closely as possible. https://www.together.xyz/blog/redpajama-models-v1 48 comments

machinelearning