# LocalAGI
LocalAGI is a self-hostable AI agent platform with a web UI, OpenAI-compatible APIs, and local-first model orchestration.
> **In Development**: This script is currently in active development and may be unstable or incomplete. Use in production environments is not recommended.
## Installation
Default install:
```bash
bash -c "$(wget -qLO - https://github.com/community-scripts/ProxmoxVED/raw/main/ct/localagi.sh)"
```
## Configuration
Config file:
```text
/opt/localagi/.env
```
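As a sketch of how the backend setting in this file can be changed, the snippet below rewrites `LOCALAGI_LLM_API_URL` in place. The helper name `set_llm_url` and the example host `10.0.0.5` are illustrative placeholders, not part of the install script; GNU `sed` (as shipped in the Debian-based container) is assumed.

```shell
#!/usr/bin/env sh
# Sketch: repoint LocalAGI at a different OpenAI-compatible backend
# by rewriting LOCALAGI_LLM_API_URL in the env file.
# set_llm_url and the example host below are illustrative placeholders.

set_llm_url() {
  # $1 = path to the .env file, $2 = new backend base URL
  sed -i "s|^LOCALAGI_LLM_API_URL=.*|LOCALAGI_LLM_API_URL=$2|" "$1"
}

# Example usage inside the container (then: systemctl restart localagi):
# set_llm_url /opt/localagi/.env "http://10.0.0.5:11434/v1"
```

After editing, restart the service so the new value is picked up.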
## Notes
- This script builds LocalAGI from source (Go + Bun) and runs it as a systemd service.
- This Proxmox script runs LocalAGI in external-backend mode and does not provision local ROCm/NVIDIA runtimes.
- By default, LocalAGI is configured to call an OpenAI-compatible backend at `http://127.0.0.1:11434/v1` (Ollama-compatible) via `LOCALAGI_LLM_API_URL`. To use an external Ollama host, edit `/opt/localagi/.env`, set `LOCALAGI_LLM_API_URL=http://<ollama-host>:11434/v1`, and restart LocalAGI with `systemctl restart localagi`.

## Web Interface

## Links

- [Official Website](https://github.com/mudler/LocalAGI)
- [Documentation](https://github.com/mudler/LocalAGI#installation-options)

---

This post is licensed under CC BY 4.0 by the author.