Open WebUI

Open WebUI is a self-hosted, web-based interface that lets you run AI models entirely offline. It integrates with various LLM runners, such as Ollama and OpenAI-compatible APIs, and supports features like Markdown and LaTeX rendering, model management, and voice/video calls. It also offers multilingual support and can generate images through APIs such as DALL-E or ComfyUI.

Installation

Default install:

bash -c "$(wget -qLO - https://github.com/community-scripts/ProxmoxVE/raw/main/ct/openwebui.sh)"
CPU: 4 cores
RAM: 8192 MB
Disk: 50 GB
OS: Debian 13

Configuration

Config file:

/root/.env
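As a rough illustration of what this file can hold, here is a hypothetical sketch; the actual keys and values are generated by the script and depend on your setup, and every value below is an assumed placeholder:

```
# Hypothetical example .env; actual contents vary by installation
PORT=8080
# Optional: point Open WebUI at a local or remote Ollama instance
OLLAMA_BASE_URL=http://127.0.0.1:11434
# Optional: key for an OpenAI-compatible API backend
OPENAI_API_KEY=your-api-key-here
```

Restart the Open WebUI service after editing this file so the new values take effect.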

Notes

The script offers an optional installation of Ollama.
The first run of the application/container can take some time, depending on your host speed, because the application is installed/updated at runtime. Please be patient!

Web Interface

Port: 8080
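To confirm the web interface is up, you can probe the port from another machine; a minimal sketch, assuming a hypothetical container IP of 192.168.1.50 (substitute your LXC's actual address):

```shell
# Hypothetical container IP; replace with your LXC's address
HOST="192.168.1.50"
PORT=8080
URL="http://${HOST}:${PORT}"

# -f fails on HTTP errors, -sS stays quiet except on real errors
if curl -fsS -o /dev/null --max-time 5 "$URL"; then
  echo "Open WebUI is reachable at $URL"
else
  echo "Open WebUI is not reachable at $URL"
fi
```

If the check fails right after installation, wait a few minutes: as noted above, the first start installs or updates the application at runtime.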

This post is licensed under CC BY 4.0 by the author.