Apache Airflow

Apache Airflow is an open-source platform for developing, scheduling, and monitoring batch-oriented workflows. It uses Python to author DAGs (Directed Acyclic Graphs) that represent workflows, and provides a rich web UI for managing and observing pipelines.

In Development
This script is currently in active development and may be unstable or incomplete. Use in production environments is not recommended.

Installation

Default install:

bash -c "$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVED/main/ct/airflow.sh)"

CPU: 2 cores
RAM: 4096 MB
Disk: 16 GB
OS: Debian 13

Default Credentials

Username: admin
Password: None (randomly generated at install; see Notes)

Notes

The initial admin password is randomly generated and stored in /opt/airflow/.env (AIRFLOW_ADMIN_PASSWORD).
Place your DAG files in /opt/airflow/dags/. The scheduler will pick them up automatically.
This installs Airflow with LocalExecutor. For distributed task execution, configure CeleryExecutor manually.
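The first two notes above can be sketched as shell commands. The snippet below uses sample paths under /tmp so it can be tried anywhere; on the container, substitute /opt/airflow/.env and /opt/airflow/dags/ (the sample password value is of course hypothetical):

```shell
# 1. Recover the generated admin password. The installer stores it as an
#    AIRFLOW_ADMIN_PASSWORD=... line in /opt/airflow/.env; a sample file
#    stands in for it here.
printf 'AIRFLOW_ADMIN_PASSWORD=s3cret\n' > /tmp/airflow.env   # stand-in for /opt/airflow/.env
grep '^AIRFLOW_ADMIN_PASSWORD=' /tmp/airflow.env | cut -d= -f2-

# 2. Deploy a DAG by copying the file into the dags folder; the scheduler
#    picks up new files automatically, so no restart is needed.
mkdir -p /tmp/airflow-dags                                    # stand-in for /opt/airflow/dags
printf '# my_dag.py placeholder\n' > /tmp/my_dag.py
cp /tmp/my_dag.py /tmp/airflow-dags/
```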

Web Interface

Port: 8080
