Set Up Open WebUI with Ollama: Your Own ChatGPT
Open WebUI is a self-hosted ChatGPT alternative. Pair it with Ollama for a fully local, private AI chat.
Prerequisites
- Ollama installed and running (`ollama run llama3.2` at least once)
- Docker (or use the pip install)
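Before installing the UI, it's worth confirming both prerequisites are in place. A quick sanity check, assuming Ollama's default port (11434); each check prints a message instead of aborting if something is missing:

```shell
# Check that Docker and Ollama are installed, and that Ollama's API answers.
command -v docker >/dev/null && docker --version || echo "Docker not found"
command -v ollama >/dev/null && ollama list || echo "Ollama not found"
curl -sf http://localhost:11434/api/version || echo "Ollama API not reachable on port 11434"
```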
Docker install (recommended)
```shell
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```
Open http://localhost:3000. Create an admin account.
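If you'd rather not run Docker, the pip route mentioned in the prerequisites is a two-command sketch. Note that outside Docker, Open WebUI serves on port 8080 by default, not 3000:

```shell
# Alternative to Docker: install Open WebUI from PyPI and start it directly.
pip install open-webui
open-webui serve
```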
Connect Ollama
In Open WebUI settings, add Ollama as a connection. URL: http://host.docker.internal:11434 (or http://localhost:11434 if not using Docker). Save. You'll see your local models in the model selector.
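If the model selector stays empty, the usual culprit is the connection URL. You can verify from the host that Ollama is reachable and has models pulled (assuming the default port):

```shell
# /api/tags lists the models Ollama has pulled. If this returns JSON,
# Open WebUI should show the same models once the connection is saved.
# (From inside the Docker container, localhost is the container itself,
# which is why the connection URL must use host.docker.internal.)
curl -s http://localhost:11434/api/tags || echo "Ollama not reachable on port 11434"
```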
Start chatting
Select a model (e.g., Llama 3.2) and type your prompt. Everything runs locally; no data leaves your machine.
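The same local inference can be exercised outside the UI through Ollama's REST API, which is what Open WebUI talks to behind the scenes. A minimal sketch, assuming llama3.2 is already pulled and Ollama is running:

```shell
# Send a one-shot prompt to the local model. "stream": false returns a
# single JSON object with the full response instead of a token stream.
curl -s http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Explain what a self-hosted LLM is in one sentence.",
  "stream": false
}' || echo "Ollama is not running"
```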
Optional: Add OpenAI for hybrid use
You can add your OpenAI API key too. Use local models for sensitive tasks, cloud for heavy lifting. Open WebUI lets you switch per conversation.
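This hybrid setup works because Ollama also exposes an OpenAI-compatible endpoint under /v1, so the same chat-completions request shape targets either backend; only the base URL, key, and model name change. A sketch (the prompt text is illustrative):

```shell
# Local: Ollama's OpenAI-compatible endpoint, no real API key required.
LOCAL_BASE="http://localhost:11434/v1"
# Cloud: the official OpenAI endpoint, authenticated with $OPENAI_API_KEY.
CLOUD_BASE="https://api.openai.com/v1"

# The request body is identical for both backends; swap the base URL
# (and the model name) to move a conversation between local and cloud.
curl -s "$LOCAL_BASE/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3.2", "messages": [{"role": "user", "content": "Hello"}]}' \
  || echo "local backend not reachable"
```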