2026

Vessel

Self-hosted control plane for Ollama. Manages local model lifecycle without cloud dependencies — no model traffic leaves the host, no third-party API trust required. Go backend, Svelte 5 frontend, Docker deploy.

Go · Svelte 5 · TypeScript · Ollama · Docker

# Context

Lightweight web UI for managing Ollama model deployments locally. Problem: Ollama's CLI is functional but friction-heavy for day-to-day use; cloud-based UIs defeat the point of local inference. Approach: Go backend proxying the Ollama API, Svelte 5 frontend, single Docker container deploy. All model traffic stays on the host — no external API calls, no data leaving the machine.
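The proxying approach described above can be sketched in a few lines of Go's standard library. This is an illustrative minimal sketch, not Vessel's actual code: the `./public` static directory and the `:8080` listen port are assumptions, while `127.0.0.1:11434` is Ollama's documented default local endpoint.

```go
package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

// newMux wires the two routes a setup like this needs: forward /api/
// to the local Ollama daemon, serve the built Svelte frontend for
// everything else. No request ever targets a non-local host.
func newMux(ollamaURL, staticDir string) *http.ServeMux {
	target, err := url.Parse(ollamaURL)
	if err != nil {
		log.Fatal(err)
	}
	mux := http.NewServeMux()
	mux.Handle("/api/", httputil.NewSingleHostReverseProxy(target))
	mux.Handle("/", http.FileServer(http.Dir(staticDir)))
	return mux
}

func main() {
	// 11434 is Ollama's default port; ./public is a hypothetical build dir.
	mux := newMux("http://127.0.0.1:11434", "./public")
	log.Println("listening on :8080")
	log.Fatal(http.ListenAndServe(":8080", mux))
}
```

Because the proxy target is a loopback address, the browser only ever talks to this single process, which in turn only talks to the local Ollama daemon.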

# Key Impact

- No model traffic leaves the host
- Single Docker container deployment
- No third-party API dependencies
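A single-container deploy of a Go binary plus a Svelte build is typically done with a multi-stage Dockerfile. The sketch below is hypothetical: the `frontend/` and `backend/` directory layout, output paths, and port are assumptions, not Vessel's actual build.

```dockerfile
# Stage 1: build the Svelte frontend (hypothetical layout)
FROM node:22-alpine AS frontend
WORKDIR /app
COPY frontend/ .
RUN npm ci && npm run build

# Stage 2: build a static Go binary
FROM golang:1.23-alpine AS backend
WORKDIR /src
COPY backend/ .
RUN CGO_ENABLED=0 go build -o /vessel .

# Stage 3: minimal runtime image with binary + static assets only
FROM alpine:3.20
COPY --from=backend /vessel /usr/local/bin/vessel
COPY --from=frontend /app/dist /public
EXPOSE 8080
ENTRYPOINT ["vessel"]
```

The multi-stage split keeps Node and the Go toolchain out of the runtime image, so the final container ships only the binary and the compiled frontend.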