@wllama/wllama

WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
Version: 2.3.1 | License: MIT
Keywords: wasm, webassembly, llama, llm, ai, rag, embeddings, generation
INSTALL
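Install the package from npm:

    npm i @wllama/wllama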
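The sketch below shows how the library is typically used for in-browser inference: construct a Wllama instance pointing at the package's WASM binaries, load a GGUF model over HTTP, and request a completion. The asset paths, model URL, and sampling options here are placeholders, so check the package README for the exact configuration expected by version 2.3.1.

    import { Wllama } from '@wllama/wllama';

    // Paths to the WASM binaries shipped with the package (assumed layout --
    // adjust to match how your bundler or server exposes these assets).
    const CONFIG_PATHS = {
      'single-thread/wllama.wasm': '/node_modules/@wllama/wllama/esm/single-thread/wllama.wasm',
      'multi-thread/wllama.wasm': '/node_modules/@wllama/wllama/esm/multi-thread/wllama.wasm',
    };

    async function main() {
      const wllama = new Wllama(CONFIG_PATHS);

      // Download and load a GGUF model (placeholder URL).
      await wllama.loadModelFromUrl('https://example.com/model.gguf');

      // Run text generation entirely in the browser.
      const output = await wllama.createCompletion('Once upon a time,', {
        nPredict: 50,
        sampling: { temp: 0.7, top_p: 0.9 },
      });
      console.log(output);
    }

    main();

Because inference runs in WebAssembly on the client, no server-side GPU or API key is required; model download time and browser memory limits are the main practical constraints.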