Local LLM

Run a local LLM inference server.

Notes

  • The server is intended to be OpenAI-compatible, so standard OpenAI clients can talk to it (see the sketch below).
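
A minimal sketch of what OpenAI compatibility means in practice, using the official `openai` npm client. The base URL, port, and model name here are assumptions for illustration; use whatever address the plugin's server actually reports.

```ts
import OpenAI from "openai";

// Point the standard OpenAI client at the local server instead of api.openai.com.
// The URL below is hypothetical -- substitute the address the plugin exposes.
const client = new OpenAI({
  baseURL: "http://localhost:8080/v1", // assumed local endpoint
  apiKey: "not-needed-locally",        // local servers typically ignore the key
});

const completion = await client.chat.completions.create({
  model: "local", // the model identifier depends on what the plugin loads
  messages: [{ role: "user", content: "Hello from the local LLM!" }],
});

console.log(completion.choices[0].message.content);
```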

Commands

```ts
import { commands } from "@hypr/plugin-local-llm";
```
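
A hedged usage sketch of the `commands` object. The command names below (`startServer`, `stopServer`) are hypothetical, chosen only to illustrate the shape of a start/stop lifecycle; consult the plugin's generated bindings for the real exports and signatures.

```ts
import { commands } from "@hypr/plugin-local-llm";

async function run() {
  // Hypothetical command: start the inference server and receive its address.
  const address = await commands.startServer();
  console.log(`local LLM server listening at ${address}`);

  // ...issue OpenAI-compatible requests against `address` here...

  // Hypothetical command: shut the server down when finished.
  await commands.stopServer();
}

run().catch(console.error);
```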

Resources