- [P] rwkv.cpp: FP16 & INT4 inference on CPU for RWKV language model https://github.com/saharNooby/rwkv.cpp 26 comments (r/MachineLearning)
Linking pages
- 🦅 Eagle 7B : Soaring past Transformers with 1 Trillion Tokens Across 100+ Languages https://blog.rwkv.com/p/eagle-7b-soaring-past-transformers 81 comments
- GitHub - go-skynet/LocalAI: :robot: Self-hosted, community-driven, local OpenAI compatible API. Drop-in replacement for OpenAI running LLMs on consumer-grade hardware. Free Open Source OpenAI alternative. No GPU required. Runs ggml, gguf, GPTQ, onnx, TF compatible models: llama, llama2, gpt4all, rwkv, whisper, vicuna, koala, cerebras, falcon, dolly, starcoder, and many others https://github.com/go-skynet/LocalAI 23 comments
- 🦅 EagleX 1.7T : Soaring past LLaMA 7B 2T in both English and Multi-lang evals (RWKV-v5) https://substack.recursal.ai/p/eaglex-17t-soaring-past-llama-7b 9 comments
- State of LLaMA 2023/Q1. Here’s a mind map for AI/ML ChatGPT… | by katopz | Apr, 2023 | Better Programming https://betterprogramming.pub/state-of-llama-2023-q1-663905c37a5e 0 comments
- GitHub - mudler/LocalAI: :robot: The free, Open Source OpenAI alternative. Self-hosted, community-driven and local-first. Drop-in replacement for OpenAI running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. It allows to generate Text, Audio, Video, Images. Also with voice cloning capabilities. https://github.com/mudler/LocalAI 0 comments
- 🦅 EagleX v2 : Soaring past LLaMA2 7B in both English and Multi-lang evals (RWKV-v5) https://blog.rwkv.com/p/eaglex-v2-soaring-past-llama2-7b 0 comments
Linked pages
- GitHub - BlinkDL/RWKV-LM: RWKV is an RNN with transformer-level LLM performance. It can be directly trained like a GPT (parallelizable). So it's combining the best of RNN and transformer - great performance, fast inference, saves VRAM, fast training, "infinite" ctx_len, and free sentence embedding. https://github.com/BlinkDL/RWKV-LM 179 comments
- Git for Windows https://gitforwindows.org/ 54 comments
- Download | CMake https://cmake.org/download/ 13 comments
- Start Locally | PyTorch https://pytorch.org/get-started/locally/ 3 comments
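
The RWKV-LM entry above summarises the model's key property: an RNN that can be trained in parallel like a GPT while running sequentially at inference time. As a rough illustration only, here is a minimal NumPy sketch of the RWKV-v4 time-mixing ("WKV") recurrence in its sequential RNN form. The function and variable names are my own, not taken from rwkv.cpp or RWKV-LM; the real implementations add an exponent-shift trick for numerical stability and run with FP16 or INT4 quantised weights.

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """Simplified RWKV-v4 "WKV" time-mixing recurrence (illustrative sketch,
    no numerical stabilisation).

    k, v: (T, C) key/value sequences for one layer
    w:    (C,) per-channel decay (>= 0)
    u:    (C,) per-channel bonus applied to the current token
    Returns the (T, C) WKV outputs.
    """
    T, C = k.shape
    a = np.zeros(C)          # running sum of exp(k_i) * v_i, decayed by exp(-w)
    b = np.zeros(C)          # running sum of exp(k_i), decayed by exp(-w)
    out = np.zeros((T, C))
    for t in range(T):
        e_cur = np.exp(u + k[t])                 # bonus-weighted current token
        out[t] = (a + e_cur * v[t]) / (b + e_cur)
        e_k = np.exp(k[t])
        a = np.exp(-w) * a + e_k * v[t]          # fold token t into the state
        b = np.exp(-w) * b + e_k
    return out
```

Because the state is just the pair (a, b) per channel, inference cost and memory stay constant per token regardless of context length, which is what makes CPU-only FP16/INT4 inference practical in rwkv.cpp.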