ADR 0004: Add a Local LLM Provider via OpenAI-Compatible API

Decision

Add a local-provider adapter that talks to any OpenAI-compatible chat-completions endpoint (Ollama, LM Studio, llama.cpp, vLLM) instead of a hosted AI API.
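Because all of these servers expose the same wire format as OpenAI's `/chat/completions` route, the adapter can be a thin HTTP client with a configurable base URL. A minimal sketch follows; the names (`build_chat_request`, `chat`, `BASE_URL`, the `llama3` model tag) and the Ollama default port are illustrative assumptions, not part of this ADR.

```python
import json
import urllib.request

# Assumed defaults: Ollama's OpenAI-compatible base URL and a locally pulled model.
BASE_URL = "http://localhost:11434/v1"
MODEL = "llama3"


def build_chat_request(messages, model=MODEL, base_url=BASE_URL):
    """Build an OpenAI-compatible chat-completions request (URL + JSON body)."""
    url = f"{base_url}/chat/completions"
    body = {"model": model, "messages": messages}
    return url, json.dumps(body).encode("utf-8")


def chat(messages, base_url=BASE_URL):
    """POST the request to the local server; same shape a hosted API expects."""
    url, data = build_chat_request(messages, base_url=base_url)
    req = urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        payload = json.load(resp)
    # OpenAI-compatible servers return choices[0].message.content.
    return payload["choices"][0]["message"]["content"]
```

Swapping between backends is then a configuration change (point `BASE_URL` at LM Studio, llama.cpp's server, or vLLM) rather than a code change.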

Why

Consequences