# Apache Airflow
Apache Airflow is an open-source platform for developing, scheduling, and monitoring batch-oriented workflows. It uses Python to author DAGs (Directed Acyclic Graphs) that represent workflows, and provides a rich web UI for managing and observing pipelines.
> **In Development**
> This script is currently in active development and may be unstable or incomplete. Use in production environments is not recommended.
## Installation
Default install:

```shell
bash -c "$(curl -fsSL https://raw.githubusercontent.com/community-scripts/ProxmoxVED/main/ct/airflow.sh)"
```
## Default Credentials
| Username | Password |
|---|---|
| admin | None |
## Notes
- The initial admin password is randomly generated and stored in `/opt/airflow/.env` (`AIRFLOW_ADMIN_PASSWORD`).
- Place your DAG files in `/opt/airflow/dags/`. The scheduler will pick them up automatically.
- This installs Airflow with the LocalExecutor. For distributed task execution, configure the CeleryExecutor manually.
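Since DAGs are plain Python files dropped into the dags folder, one might look like the following minimal sketch. The DAG and task names are invented for illustration, and it assumes Airflow 2.x's TaskFlow API (`airflow.decorators`):

```python
# /opt/airflow/dags/hello_airflow.py -- minimal example DAG (names invented).
# Requires a working Airflow installation; the scheduler imports this file.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def hello_airflow():
    @task
    def say_hello() -> str:
        return "hello from Airflow"

    say_hello()


hello_airflow()
```

Once saved under `/opt/airflow/dags/`, the DAG should appear in the web UI after the scheduler's next parse cycle.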
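The executor switch lives in `airflow.cfg` (or the matching `AIRFLOW__*` environment variables). A hedged sketch of the relevant fragment; the broker and result-backend URLs are placeholders, not defaults, and a full Celery setup additionally needs a broker and `airflow celery worker` processes:

```ini
# airflow.cfg fragment (sketch): switch to CeleryExecutor.
# broker_url / result_backend values below are placeholders.
[core]
executor = CeleryExecutor

[celery]
broker_url = redis://localhost:6379/0
result_backend = db+postgresql://airflow:airflow@localhost/airflow
```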
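The generated admin password can be read straight out of the env file mentioned above. A minimal sketch; a throwaway sample file (with a made-up password value) stands in for `/opt/airflow/.env` here, so substitute the real path on an actual install:

```shell
# Sketch: extract AIRFLOW_ADMIN_PASSWORD from a .env-style file.
# On a real install, set ENV_FILE=/opt/airflow/.env instead of the
# sample file created below (its password value is made up).
ENV_FILE="$(mktemp)"
printf 'AIRFLOW_HOME=/opt/airflow\nAIRFLOW_ADMIN_PASSWORD=s3cret-sample\n' > "$ENV_FILE"

ADMIN_PW="$(grep '^AIRFLOW_ADMIN_PASSWORD=' "$ENV_FILE" | cut -d= -f2-)"
echo "$ADMIN_PW"   # prints s3cret-sample for this sample file

rm -f "$ENV_FILE"
```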