# Airflow MCP Server

Inspect and manage Apache Airflow DAGs, runs, and logs across instances.

This README is the human entrypoint for running and using the Apache Airflow MCP server. The server exposes safe, focused tools to inspect Airflow DAGs, runs, and logs (with optional write operations gated by client approval). Responses are structured JSON objects (dicts) and include a `request_id` for traceability.
## Quickstart
### 1) Install the server (PyPI)
Install with uv so you are exercising the exact bits that ship to users and get reproducible virtualenvs:

```sh
uv tool install apache-airflow-mcp-server
```

No uv? Fall back to pip:

```sh
pip install apache-airflow-mcp-server
```
### 2) Configure instances (required)
Set `AIRFLOW_MCP_INSTANCES_FILE` to a YAML file listing the available Airflow instances. Values may reference environment variables using `${VAR}` syntax; missing variables cause startup errors.
Example (`examples/instances.yaml`):

```yaml
# Data team staging instance
data-stg:
  host: https://airflow.data-stg.example.com/
  api_version: v1
  verify_ssl: true
  auth:
    type: basic
    username: ${AIRFLOW_INSTANCE_DATA_STG_USERNAME}
    password: ${AIRFLOW_INSTANCE_DATA_STG_PASSWORD}

# ML team staging instance
ml-stg:
  host: https://airflow.ml-stg.example.com/
  api_version: v1
  verify_ssl: true
  auth:
    type: basic
    username: ${AIRFLOW_INSTANCE_ML_STG_USERNAME}
    password: ${AIRFLOW_INSTANCE_ML_STG_PASSWORD}

# Bearer token (experimental)
# ml-prod:
#   host: https://airflow.ml-prod.example.com/
#   api_version: v1
#   verify_ssl: true
#   auth:
#     type: bearer
#     token: ${AIRFLOW_INSTANCE_ML_PROD_TOKEN}
```
Bearer token auth is experimental; basic auth remains the primary, well-tested path.
Kubernetes deployment tip: provide `instances.yaml` via a Secret and mount it at `/config/instances.yaml` (set `AIRFLOW_MCP_INSTANCES_FILE=/config/instances.yaml`).
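The `${VAR}` substitution described above can be sketched in a few lines. This is an illustrative approximation (the function name `expand_env` is hypothetical), not the server's actual implementation:

```python
import os
import re

_VAR_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(value: str) -> str:
    """Replace each ${VAR} with os.environ[VAR]; raise if VAR is unset."""
    def _sub(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            # Mirrors the documented behavior: missing variables are startup errors.
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]
    return _VAR_PATTERN.sub(_sub, value)

os.environ["AIRFLOW_INSTANCE_DATA_STG_USERNAME"] = "svc-data-stg"
print(expand_env("${AIRFLOW_INSTANCE_DATA_STG_USERNAME}"))  # svc-data-stg
```

Failing fast on a missing variable at load time (rather than passing `${VAR}` through as a literal) is what turns a typo in a Secret into an obvious startup error instead of a confusing 401 later.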
Environment variables:

- `AIRFLOW_MCP_INSTANCES_FILE` (required): path to the registry YAML
- `AIRFLOW_MCP_DEFAULT_INSTANCE` (optional): default instance key
- `AIRFLOW_MCP_HTTP_HOST` (default: `127.0.0.1`)
- `AIRFLOW_MCP_HTTP_PORT` (default: `8765`)
- `AIRFLOW_MCP_TIMEOUT_SECONDS` (default: `30`)
- `AIRFLOW_MCP_LOG_FILE` (optional)
- `AIRFLOW_MCP_HTTP_BLOCK_GET_ON_MCP` (default: `true`)
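Reading this configuration with the documented defaults might look like the following sketch (the `load_server_settings` helper is hypothetical; only the variable names and defaults come from the list above):

```python
import os

def load_server_settings(env=os.environ) -> dict:
    """Collect server settings, applying the documented defaults."""
    instances_file = env.get("AIRFLOW_MCP_INSTANCES_FILE")
    if not instances_file:
        raise RuntimeError("AIRFLOW_MCP_INSTANCES_FILE is required")
    return {
        "instances_file": instances_file,
        "default_instance": env.get("AIRFLOW_MCP_DEFAULT_INSTANCE"),  # optional
        "http_host": env.get("AIRFLOW_MCP_HTTP_HOST", "127.0.0.1"),
        "http_port": int(env.get("AIRFLOW_MCP_HTTP_PORT", "8765")),
        "timeout_seconds": float(env.get("AIRFLOW_MCP_TIMEOUT_SECONDS", "30")),
        "log_file": env.get("AIRFLOW_MCP_LOG_FILE"),  # optional
        "block_get_on_mcp": env.get(
            "AIRFLOW_MCP_HTTP_BLOCK_GET_ON_MCP", "true"
        ).lower() == "true",
    }

settings = load_server_settings({"AIRFLOW_MCP_INSTANCES_FILE": "/config/instances.yaml"})
print(settings["http_port"])  # 8765
```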
### 3) Run the server
- HTTP (recommended for tooling):

  ```sh
  uv run airflow-mcp --transport http --host 127.0.0.1 --port 8765
  ```

- STDIO (CLI/terminal workflows):

  ```sh
  uv run airflow-mcp --transport stdio
  ```

Health check (HTTP): `GET /health` → `200 OK`.
Tip: a `fastmcp.json` is included for discovery/config by FastMCP tooling:

```json
{
  "$schema": "https://gofastmcp.com/schemas/fastmcp_config/v1.json",
  "entrypoint": { "file": "src/airflow_mcp/server.py", "object": "mcp" },
  "deployment": { "transport": "http", "host": "127.0.0.1", "port": 8765 }
}
```
### 4) Typical incident workflow
Start from an Airflow UI URL (often in a Datadog alert):

1. `airflow_resolve_url(url)` → resolve `instance`, `dag_id`, `dag_run_id`, `task_id`.
2. `airflow_list_dag_runs(instance|ui_url, dag_id)` → confirm recent state.
3. `airflow_get_task_instance(instance|ui_url, dag_id, dag_run_id, task_id, include_rendered?, max_rendered_bytes?)` → inspect task metadata, attempts, and optional rendered fields.
4. `airflow_get_task_instance_logs(instance|ui_url, dag_id, dag_run_id, task_id, try_number, filter_level?, context_lines?, tail_lines?, max_bytes?)` → inspect the failure with optional filtering and truncation.
All tools accept either `instance` or `ui_url`. If both are given and disagree, the call fails with `INSTANCE_MISMATCH`. `ui_url` must be a fully qualified http(s) Airflow URL; use `airflow_list_instances()` to discover valid hosts when you only have an instance key.
## Tool Reference (Structured JSON)
Discovery and URL utilities:

- `airflow_list_instances()` → list configured instance keys and the default
- `airflow_describe_instance(instance)` → host, api_version, verify_ssl, `auth_type` (redacted)
- `airflow_resolve_url(url)` → resolve instance and identifiers from an Airflow UI URL
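The URL-resolution step can be approximated with the standard library. This sketch assumes an Airflow 2 grid-view URL shape (`/dags/<dag_id>/grid?dag_run_id=...&task_id=...`); the real tool presumably recognizes more UI routes and maps the host back to a configured instance key:

```python
from urllib.parse import urlsplit, parse_qs

def resolve_airflow_url(url: str) -> dict:
    """Pull dag_id and optional run/task identifiers out of an
    Airflow 2 grid-view URL (hypothetical helper)."""
    parts = urlsplit(url)
    if parts.scheme not in ("http", "https"):
        raise ValueError("ui_url must be a fully qualified http(s) URL")
    segments = [s for s in parts.path.split("/") if s]
    if "dags" not in segments:
        raise ValueError("not a recognized Airflow DAG URL")
    dag_id = segments[segments.index("dags") + 1]
    query = parse_qs(parts.query)
    return {
        "host": parts.netloc,
        "dag_id": dag_id,
        "dag_run_id": query.get("dag_run_id", [None])[0],
        "task_id": query.get("task_id", [None])[0],
    }

info = resolve_airflow_url(
    "https://airflow.data-stg.example.com/dags/etl_daily/grid"
    "?dag_run_id=manual__2024-01-01&task_id=load"
)
print(info["dag_id"])  # etl_daily
```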
Read-only tools:

- `airflow_list_dag_runs` → confirm recent state of DAG runs
- `airflow_get_task_instance` → inspect task metadata, attempts, and optional rendered fields
- `airflow_get_task_instance_logs` → inspect task failure logs with optional filtering and truncation

## Environment Variables
- `AIRFLOW_MCP_INSTANCES_FILE` (required): path to the registry YAML file containing Airflow instance configurations
- `AIRFLOW_MCP_DEFAULT_INSTANCE`: the default instance key to use if none is specified
- `AIRFLOW_MCP_HTTP_HOST`: host address for the HTTP server
- `AIRFLOW_MCP_HTTP_PORT`: port for the HTTP server
- `AIRFLOW_MCP_TIMEOUT_SECONDS`: timeout in seconds for API requests
- `AIRFLOW_MCP_LOG_FILE`: path to a log file for server output
- `AIRFLOW_MCP_HTTP_BLOCK_GET_ON_MCP`: whether to block GET requests on the MCP transport

## Configuration
Example MCP client configuration (STDIO transport):

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uv",
      "args": ["run", "airflow-mcp", "--transport", "stdio"],
      "env": { "AIRFLOW_MCP_INSTANCES_FILE": "/path/to/instances.yaml" }
    }
  }
}
```