inference-server

Libraries and a server for building AI applications. Adapters to various native bindings allow local inference. Integrate it into your application, or use it as a microservice.
Version: 1.0.0-beta.31 · License: MIT
Keywords
local ai, inference server, model pool, gpt4all, node-llama-cpp, transformers.js, llama.cpp, chatbot, bot, llm, ai, nlp, openai api
INSTALL
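A minimal install, assuming the package is published on npm under the name shown above:

    npm install inference-server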