
Open WebUI

Self-hosted Open WebUI on the homelab cluster, fronting a LiteLLM proxy that routes to local Ollama instances and a Hetzner-hosted fallback. No data leaves Switzerland.
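The wiring described above might look roughly like the following Compose sketch. Hostnames, ports, and the API key are placeholders, not the actual homelab values; the real deployment details are still to be documented.

```yaml
# Hypothetical sketch: Open WebUI talking to a LiteLLM proxy on the same
# cluster. All names and keys are illustrative.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      # Open WebUI speaks the OpenAI API; point it at LiteLLM instead of
      # any external provider.
      - OPENAI_API_BASE_URL=http://litellm:4000/v1
      - OPENAI_API_KEY=sk-placeholder-proxy-key
    depends_on:
      - litellm

  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    command: ["--config", "/app/config.yaml"]
    volumes:
      - ./litellm-config.yaml:/app/config.yaml
    ports:
      - "4000:4000"
```

The point of the indirection is that Open WebUI never knows which backend answered; LiteLLM owns all routing decisions.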

The goal is a ChatGPT-equivalent for personal and client use where the conversation never touches an external API. Swiss data residency is a real constraint for some client contexts — not theoretical.

Setup is still being documented. The routing logic between local and fallback is the first thing worth writing up.
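Until that write-up exists, the local-first routing can be sketched with LiteLLM's fallback mechanism. Model names and hosts here are placeholders: the idea is that the primary entry points at a local Ollama instance, and a second entry on a Hetzner host is only tried when the primary errors out.

```yaml
# Hypothetical LiteLLM config sketch. Model names, hosts, and ports are
# illustrative, not the real cluster values.
model_list:
  # Primary: local Ollama instance on the homelab cluster
  - model_name: chat
    litellm_params:
      model: ollama/llama3.1
      api_base: http://ollama-node1:11434

  # Fallback: Hetzner-hosted Ollama instance
  - model_name: chat-fallback
    litellm_params:
      model: ollama/llama3.1
      api_base: https://fallback.internal:11434

router_settings:
  # On failure against "chat", retry the request against "chat-fallback"
  fallbacks:
    - chat: ["chat-fallback"]
```

Both backends run the same model, so a failover is invisible to the client apart from latency.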
