This Docker Compose configuration sets up Open WebUI with Tailscale as a sidecar container to keep the app reachable over your Tailnet.
Open WebUI is a feature-rich, self-hosted AI platform that provides a ChatGPT-style interface for local and cloud-based AI models. It supports Ollama and any OpenAI-compatible API. Pairing it with Tailscale means your private AI interface is securely accessible from any of your devices without exposing it to the public internet.
In this setup, the `tailscale-open-webui` service runs Tailscale and manages secure networking for Open WebUI. The `open-webui` service shares the Tailscale container's network stack via Docker's `network_mode: service:tailscale-open-webui` setting. This keeps the app Tailnet-only unless you intentionally expose ports.
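The sidecar pattern looks roughly like this (a minimal sketch; the image tags, volume paths, and `hostname` value are assumptions to check against your actual `compose.yaml`):

```yaml
services:
  tailscale-open-webui:
    image: tailscale/tailscale:latest
    hostname: open-webui              # becomes the MagicDNS name on your Tailnet
    environment:
      - TS_AUTHKEY=${TS_AUTHKEY}
      - TS_STATE_DIR=/var/lib/tailscale
      # - TS_ACCEPT_DNS=true          # uncomment if DNS resolution issues arise
    volumes:
      - ./ts/state:/var/lib/tailscale
    devices:
      - /dev/net/tun:/dev/net/tun
    cap_add:
      - net_admin
    restart: unless-stopped

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    network_mode: service:tailscale-open-webui   # share the sidecar's network stack
    environment:
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL}
      - WEBUI_SECRET_KEY=${WEBUI_SECRET_KEY}
    volumes:
      - ./open-webui-data:/app/backend/data
    depends_on:
      - tailscale-open-webui
    restart: unless-stopped
```

Because `open-webui` has no network of its own, it publishes no ports; everything reaches it through the Tailscale container.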
- Prerequisites: Docker and Docker Compose installed. No special group membership, GPU, or devices required for CPU-only inference. A Tailscale account with an auth key from https://tailscale.com/admin/authkeys.
- Volumes: Pre-create `./open-webui-data` before deploying to avoid Docker creating a root-owned directory: `mkdir -p ./open-webui-data ./config ./ts/state`
- MagicDNS/Serve: Enable MagicDNS and HTTPS in your Tailscale admin console before deploying. The serve config proxies to port `8080`; this is hardcoded in the `configs` block and does not consume `.env` values. Uncomment `TS_ACCEPT_DNS=true` in `compose.yaml` if DNS resolution issues arise.
- Ollama: Set `OLLAMA_BASE_URL` in `.env` to point at your Ollama instance. Options:
  - Same Docker host: `http://host.docker.internal:11434`
  - LAN machine: `http://<local-ip>:11434` (use the private IP of the machine running Ollama)
  - Another Tailnet device: `http://100.x.x.x:11434`
  - Leave blank to configure a different provider (e.g. OpenAI) via the UI after first launch.
- Ports: The `0.0.0.0:${SERVICEPORT}:${SERVICEPORT}` mapping is commented out by default. Uncomment only if LAN access is required alongside Tailnet access.
- Gotchas:
- Create your admin account immediately after first launch — Open WebUI is open to registration until the first user is created.
- Open WebUI requires WebSocket support — ensure nothing in your network path blocks WebSocket connections.
- After adding new models to Ollama, refresh the model list in Open WebUI via Settings → Connections.
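For reference, the kind of Tailscale serve config that proxies HTTPS traffic to port 8080 typically looks like this (a sketch of the JSON file mounted via the `configs` block; the exact file in this repo may differ):

```json
{
  "TCP": {
    "443": { "HTTPS": true }
  },
  "Web": {
    "${TS_CERT_DOMAIN}:443": {
      "Handlers": {
        "/": { "Proxy": "http://127.0.0.1:8080" }
      }
    }
  }
}
```

Here `${TS_CERT_DOMAIN}` is expanded by Tailscale to the node's full MagicDNS name, and the proxy target is `127.0.0.1` because Open WebUI shares the sidecar's network namespace.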
Check the following file before deploying, as some variables must be defined upfront.

`.env` (main variables): `TS_AUTHKEY`, `SERVICE`, `IMAGE_URL`, `OLLAMA_BASE_URL`, `WEBUI_SECRET_KEY`
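A starting point for `.env` might look like this (all values are placeholders; substitute your own key, image, and URLs):

```ini
# Tailscale
TS_AUTHKEY=tskey-auth-xxxxxxxxxxxx      # from https://tailscale.com/admin/authkeys
SERVICE=open-webui                      # service/hostname label used by the compose file

# Open WebUI
IMAGE_URL=ghcr.io/open-webui/open-webui:main
OLLAMA_BASE_URL=http://host.docker.internal:11434
WEBUI_SECRET_KEY=change-me              # e.g. generate with: openssl rand -hex 32
```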