Open WebUI

AI Chat & Assistants

A self-hosted AI chat UI compatible with Ollama and OpenAI-compatible APIs. Capable of running fully offline.

4.4
Web self-hosting (Docker)

What is Open WebUI?

Open WebUI is an open-source, self-hosted AI chat platform. It supports a wide range of LLM runners, including Ollama and OpenAI-compatible APIs, and can operate completely offline. It offers advanced features such as RAG (Retrieval-Augmented Generation), voice and video calls, document processing, and Python tool calling. It supports nine vector databases, including ChromaDB, PostgreSQL, Qdrant, and Milvus, and includes prompt injection protection via LLM-Guard. It is one of the most popular self-hosted AI tools of 2026 for individuals and organizations that prioritize privacy.
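As a reference point, a minimal Docker-based install can look like the sketch below. The image name and internal port follow the project's published quickstart; the host port (3000) and volume name are illustrative choices you can adjust for your environment.

```shell
# Pull and run Open WebUI, persisting chats and settings in a named volume.
# Once the container is up, the UI is served at http://localhost:3000.
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

The named volume (`open-webui`) is what survives container upgrades, so recreating the container with a newer image keeps your chat history and users intact.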

Pricing Plans

Free (open source)

Key Features

Multi-model support
RAG
Voice & video calls
Document processing
Prompt injection protection
9 vector database support

Pros & Cons

Pros

  • Completely free and open source
  • Works offline
  • Excellent privacy protection

Cons

  • Requires technical knowledge such as Docker for setup
  • Japanese localization of the UI is incomplete
  • Requires your own GPU/server

Frequently Asked Questions

Q. Is Open WebUI free?

A. Yes, it is completely open source and free. However, you need a server or PC (possibly with a GPU) to run it.

Q. What is the difference from Ollama?

A. Ollama is an engine for running LLMs locally, while Open WebUI is its frontend (user interface). By connecting Open WebUI to Ollama, you can use local AI through a ChatGPT-like UI.
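To make the split concrete, the following sketch wires the two together, assuming Ollama is already installed on the host and listening on its default port 11434. The `OLLAMA_BASE_URL` environment variable and the `host.docker.internal` mapping follow commonly documented usage; verify both against the current Open WebUI docs for your platform.

```shell
# 1. Engine side: pull a model into Ollama (serves on port 11434 by default).
ollama pull llama3

# 2. Frontend side: run Open WebUI and point it at the host's Ollama API.
#    --add-host makes host.docker.internal resolvable from the container on Linux.
docker run -d \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

After both are running, opening http://localhost:3000 gives the ChatGPT-like UI, with the pulled Ollama model selectable from the model dropdown.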

Related Tools