Background

Local LLM Interface

Next.js · TypeScript · Tailwind CSS · Ollama · Vercel · Vercel AI SDK

A clean, minimal, local-first chat interface for running LLMs with Ollama. Designed for privacy, speed, and simplicity, it lets you chat with local models in a modern UI, with optional OpenAI-powered previews for demos.
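As a rough sketch of how a chat turn reaches a local model, Ollama serves a REST API at `http://localhost:11434`, and its `/api/chat` endpoint accepts a JSON body of the form `{ model, messages, stream }`. The helper names below (`buildChatRequest`, `chatOnce`) and the model name are illustrative assumptions, not part of this project's code:

```typescript
// Minimal sketch of one chat turn against a local Ollama server.
// Assumes Ollama is running locally with the named model pulled.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Build the JSON body for Ollama's /api/chat endpoint.
function buildChatRequest(model: string, messages: ChatMessage[], stream = false) {
  return { model, messages, stream };
}

// Send one non-streaming chat turn and return the assistant's reply text.
async function chatOnce(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content;
}
```

Because everything goes to `localhost`, prompts and replies never leave the machine, which is where the privacy and speed claims come from; the optional OpenAI-backed preview would swap in a hosted endpoint behind the same chat interface.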