@wllama/wllama
Module format: ESM
WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
Version: 2.3.7
License: MIT
Keywords: wasm, webassembly, llama, llm, ai, rag, embeddings, generation
INSTALL
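The package is published on npm, so a typical setup installs it with a package manager and imports the Wllama class in browser code. The sketch below is a minimal, hedged example: the CONFIG_PATHS values and the model URL are placeholders, and the loadModelFromUrl / createCompletion calls follow the project's documented API, whose exact option names may differ between versions.

```typescript
// Install (shell): npm install @wllama/wllama

import { Wllama } from '@wllama/wllama';

// Paths to the WASM builds shipped in the package's esm/ directory.
// Adjust these to wherever your bundler or static server exposes the files.
const CONFIG_PATHS = {
  'single-thread/wllama.wasm': '/esm/single-thread/wllama.wasm',
  'multi-thread/wllama.wasm': '/esm/multi-thread/wllama.wasm',
};

async function run(): Promise<void> {
  const wllama = new Wllama(CONFIG_PATHS);

  // Load a GGUF model over HTTP; the URL below is a placeholder.
  await wllama.loadModelFromUrl('https://example.com/models/tinyllama.gguf');

  // Run a short completion entirely in the browser.
  const output = await wllama.createCompletion('Hello, my name is', {
    nPredict: 32,
  });
  console.log(output);
}

run();
```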