llama-cpp-capacitor

llama-cpp-capacitor JS library on GitHub
llama-cpp-capacitor JS library on npm
Download llama-cpp-capacitor JS library

A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. Complete iOS and Android support: text generation, chat, multimodal input, TTS, LoRA, embeddings, and more.
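To illustrate what a "chat-first" offline inference call might look like, here is a minimal TypeScript sketch. The names `initLlamaMock`, `LlamaContext`, and the message/completion shapes are assumptions made for illustration, not the plugin's documented API; a mock stands in for the native llama.cpp context so the sketch runs anywhere.

```typescript
// Hypothetical chat-first call shape; the real plugin's API may differ.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface CompletionResult {
  text: string;
}

interface LlamaContext {
  // Takes an OpenAI-style message array, resolves with generated text.
  completion(params: { messages: ChatMessage[] }): Promise<CompletionResult>;
}

// Mock standing in for the native context so this sketch is self-contained.
// In a real app this would load a GGUF model via the native plugin instead.
function initLlamaMock(_opts: { modelPath: string }): LlamaContext {
  return {
    async completion({ messages }) {
      const last = messages[messages.length - 1];
      return { text: `echo: ${last.content}` };
    },
  };
}

async function main(): Promise<string> {
  const ctx = initLlamaMock({ modelPath: "models/ggml-model-q4_0.gguf" });
  const result = await ctx.completion({
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Hello!" },
    ],
  });
  return result.text;
}

main().then((text) => console.log(text));
```

The chat-first shape (an array of role-tagged messages rather than a raw prompt string) mirrors how most modern LLM APIs are consumed, which keeps app code portable between on-device and hosted backends.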

Version 0.1.5 License MIT
llama-cpp-capacitor has no homepage
Keywords
capacitor, plugin, native, ios, android, llama, llama.cpp, ai, machine-learning, offline-ai, text-generation, multimodal, tts, text-to-speech, lora