- [R] ChatGLM-6B - an open-source 6.2-billion-parameter English/Chinese bilingual LLM trained on 1T tokens, supplemented by supervised fine-tuning, feedback bootstrapping, and RLHF. Runs on consumer-grade GPUs https://github.com/THUDM/ChatGLM-6B/blob/main/README_en.md 48 comments machinelearning
- ChatGLM, an open-source, self-hosted dialogue language model and alternative to ChatGPT created by Tsinghua University, can be run with as little as 6GB of GPU memory. https://github.com/THUDM/ChatGLM-6B/blob/main/README_en.md 45 comments selfhosted
Linking pages
- OpenAI Vendor Lock-in: The Ironic Story of How OpenAI Went from Open Source to "Open Your Wallet" | LunaTrace https://www.lunasec.io/docs/blog/openai-not-so-open/ 12 comments
- My 2023 Homelab Setup | Mudkip Mud Sport https://mudkip.me/2024/01/31/My-2023-Homelab-Setup/ 4 comments
- Overcoming LLM Hallucinations Using Retrieval Augmented Generation (RAG) - Unite.AI https://www.unite.ai/overcoming-llm-hallucinations-using-retrieval-augmented-generation-rag/ 1 comment
Linked pages
- Gradio https://gradio.app/ 29 comments
- [2210.02414] GLM-130B: An Open Bilingual Pre-trained Model https://arxiv.org/abs/2210.02414 1 comment
- GitHub - THUDM/GLM-130B: GLM-130B: An Open Bilingual Pre-Trained Model https://github.com/THUDM/GLM-130B 1 comment
- tdm-gcc https://jmeubank.github.io/tdm-gcc/ 1 comment
- GitHub - alibaba/MNN: MNN is a blazing fast, lightweight deep learning framework, battle-tested by business-critical use cases in Alibaba https://github.com/alibaba/MNN 0 comments
- [2210.17323] GPTQ: Accurate Post-Training Quantization for Generative Pre-trained Transformers https://arxiv.org/abs/2210.17323 0 comments