Linking pages
- Fast and Portable Llama2 Inference on the Heterogeneous Edge https://www.secondstate.io/articles/fast-llm-inference/ 98 comments
- GitHub - WasmEdge/WasmEdge: WasmEdge is a lightweight, high-performance, and extensible WebAssembly runtime for cloud native, edge, and decentralized applications. It powers serverless apps, embedded functions, microservices, smart contracts, and IoT devices. https://github.com/WasmEdge/WasmEdge 33 comments
Linked pages
- LlamaIndex - Data Framework for LLM Applications https://www.llamaindex.ai/ 55 comments
- GitHub - WasmEdge/WasmEdge: a lightweight, high-performance, and extensible WebAssembly runtime https://github.com/WasmEdge/WasmEdge 33 comments
- Flows.network https://flows.network/ 3 comments
- LangChain https://langchain.com/ 0 comments
Title: Wasm as the runtime for LLMs and AGI