Apache Airflow MCP Server

Add it to Claude Code

Run this in a terminal:

claude mcp add -e "AIRFLOW_MCP_INSTANCES_FILE=${AIRFLOW_MCP_INSTANCES_FILE}" apache-airflow -- uv run airflow-mcp --transport stdio

Required: AIRFLOW_MCP_INSTANCES_FILE (plus 6 optional variables, listed below).
README.md

Inspect and manage Apache Airflow DAGs, runs, and logs across instances.

Airflow MCP Server

Human entrypoint for running and using the Apache Airflow MCP server. This server exposes safe, focused tools to inspect Airflow DAGs, runs, and logs (with optional write operations gated by client approval). Responses are structured JSON objects (dicts) and include a request_id for traceability.

Quickstart

1) Install the server (PyPI)

Install with uv so you are exercising the exact bits that ship to users and get reproducible virtualenvs:

uv tool install apache-airflow-mcp-server

No uv? Fall back to pip:

pip install apache-airflow-mcp-server

2) Configure instances (required)

Set AIRFLOW_MCP_INSTANCES_FILE to a YAML file listing available Airflow instances. Values may reference environment variables using ${VAR} syntax. Missing variables cause startup errors.

Example (examples/instances.yaml):

# Data team staging instance
data-stg:
  host: https://airflow.data-stg.example.com/
  api_version: v1
  verify_ssl: true
  auth:
    type: basic
    username: ${AIRFLOW_INSTANCE_DATA_STG_USERNAME}
    password: ${AIRFLOW_INSTANCE_DATA_STG_PASSWORD}

# ML team staging instance
ml-stg:
  host: https://airflow.ml-stg.example.com/
  api_version: v1
  verify_ssl: true
  auth:
    type: basic
    username: ${AIRFLOW_INSTANCE_ML_STG_USERNAME}
    password: ${AIRFLOW_INSTANCE_ML_STG_PASSWORD}

# Bearer token (experimental)
# ml-prod:
#   host: https://airflow.ml-prod.example.com/
#   api_version: v1
#   verify_ssl: true
#   auth:
#     type: bearer
#     token: ${AIRFLOW_INSTANCE_ML_PROD_TOKEN}

Bearer token auth is experimental; basic auth remains the primary, well-tested path.
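To picture how the `${VAR}` references above resolve at startup, here is a minimal sketch of fail-fast environment expansion. The function name `expand_env_refs` is hypothetical (not part of the server's API); only the documented behavior is assumed: values are substituted from the environment, and a missing variable aborts startup.

```python
import os
import re

def expand_env_refs(value: str) -> str:
    """Replace ${VAR} references with environment values.

    Raises KeyError for any missing variable, mirroring the
    documented fail-fast behavior at startup.
    """
    def lookup(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name!r} is not set")
        return os.environ[name]

    return re.sub(r"\$\{([A-Z0-9_]+)\}", lookup, value)

# Example: resolve the username reference from the YAML above.
os.environ["AIRFLOW_INSTANCE_DATA_STG_USERNAME"] = "svc-data"
print(expand_env_refs("${AIRFLOW_INSTANCE_DATA_STG_USERNAME}"))  # svc-data
```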

Kubernetes deployment tip: Provide instances.yaml via a Secret and mount it at /config/instances.yaml (set AIRFLOW_MCP_INSTANCES_FILE=/config/instances.yaml).

Environment variables:

  • AIRFLOW_MCP_INSTANCES_FILE (required): path to registry YAML
  • AIRFLOW_MCP_DEFAULT_INSTANCE (optional): default instance key
  • AIRFLOW_MCP_HTTP_HOST (default: 127.0.0.1)
  • AIRFLOW_MCP_HTTP_PORT (default: 8765)
  • AIRFLOW_MCP_TIMEOUT_SECONDS (default: 30)
  • AIRFLOW_MCP_LOG_FILE (optional)
  • AIRFLOW_MCP_HTTP_BLOCK_GET_ON_MCP (default: true)
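The variables above can be read as a small settings object. This is an illustrative sketch only (the helper `load_settings` is hypothetical, not the server's actual code); the defaults match the documented values:

```python
def load_settings(env: dict) -> dict:
    """Collect server settings from an environment mapping.

    AIRFLOW_MCP_INSTANCES_FILE is required; everything else falls
    back to the documented defaults.
    """
    if "AIRFLOW_MCP_INSTANCES_FILE" not in env:
        raise RuntimeError("AIRFLOW_MCP_INSTANCES_FILE is required")
    return {
        "instances_file": env["AIRFLOW_MCP_INSTANCES_FILE"],
        "default_instance": env.get("AIRFLOW_MCP_DEFAULT_INSTANCE"),
        "http_host": env.get("AIRFLOW_MCP_HTTP_HOST", "127.0.0.1"),
        "http_port": int(env.get("AIRFLOW_MCP_HTTP_PORT", "8765")),
        "timeout_seconds": float(env.get("AIRFLOW_MCP_TIMEOUT_SECONDS", "30")),
        "log_file": env.get("AIRFLOW_MCP_LOG_FILE"),
        "block_get_on_mcp": env.get(
            "AIRFLOW_MCP_HTTP_BLOCK_GET_ON_MCP", "true"
        ).lower() == "true",
    }

print(load_settings({"AIRFLOW_MCP_INSTANCES_FILE": "/config/instances.yaml"})["http_port"])  # 8765
```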

3) Run the server

  • HTTP (recommended for tooling):
uv run airflow-mcp --transport http --host 127.0.0.1 --port 8765
  • STDIO (CLI/terminal workflows):
uv run airflow-mcp --transport stdio

Health check (HTTP): GET /health → 200 OK.

Tip: A fastmcp.json is included for discovery/config by FastMCP tooling:

{
  "$schema": "https://gofastmcp.com/schemas/fastmcp_config/v1.json",
  "entrypoint": { "file": "src/airflow_mcp/server.py", "object": "mcp" },
  "deployment": { "transport": "http", "host": "127.0.0.1", "port": 8765 }
}

4) Typical incident workflow

Start from an Airflow UI URL (often in a Datadog alert):

  1. airflow_resolve_url(url) → resolve instance, dag_id, dag_run_id, task_id.
  2. airflow_list_dag_runs(instance|ui_url, dag_id) → confirm recent state.
  3. airflow_get_task_instance(instance|ui_url, dag_id, dag_run_id, task_id, include_rendered?, max_rendered_bytes?) → inspect task metadata, attempts, and optional rendered fields.
  4. airflow_get_task_instance_logs(instance|ui_url, dag_id, dag_run_id, task_id, try_number, filter_level?, context_lines?, tail_lines?, max_bytes?) → inspect failure with optional filtering and truncation.

All tools accept either instance or ui_url. If both are given and disagree, the call fails with INSTANCE_MISMATCH. ui_url must be a fully qualified http(s) Airflow URL; use airflow_list_instances() to discover valid hosts when you only have an instance key.
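A simplified sketch of the kind of parsing airflow_resolve_url performs, assuming an Airflow 2.x grid-view URL shape (/dags/<dag_id>/grid?dag_run_id=...&task_id=...). `REGISTRY` and `resolve_ui_url` are illustrative names, not the server's API:

```python
from urllib.parse import urlparse, parse_qs

# Hosts mirroring instances.yaml; keys are instance names.
REGISTRY = {"data-stg": "airflow.data-stg.example.com"}

def resolve_ui_url(url: str) -> dict:
    """Map an Airflow UI URL to an instance key and run identifiers."""
    parts = urlparse(url)
    if parts.scheme not in ("http", "https"):
        raise ValueError("ui_url must be a fully qualified http(s) URL")
    instance = next((k for k, h in REGISTRY.items() if h == parts.netloc), None)
    if instance is None:
        raise LookupError(f"no configured instance matches host {parts.netloc!r}")
    segments = parts.path.strip("/").split("/")
    dag_id = segments[1] if len(segments) > 1 and segments[0] == "dags" else None
    query = parse_qs(parts.query)
    return {
        "instance": instance,
        "dag_id": dag_id,
        "dag_run_id": query.get("dag_run_id", [None])[0],
        "task_id": query.get("task_id", [None])[0],
    }

print(resolve_ui_url(
    "https://airflow.data-stg.example.com/dags/daily_etl/grid?dag_run_id=run_1&task_id=t1"
))
```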

Tool Reference (Structured JSON)

Discovery and URL utilities:

  • airflow_list_instances() → list configured instance keys and default
  • airflow_describe_instance(instance) → host, api_version, verify_ssl, auth_type (redacted)
  • airflow_resolve_url(url) → resolve instance and identifiers from an Airflow UI URL

Read-only tools (6):

  • airflow_list_instances: list configured instance keys and the default instance
  • airflow_describe_instance: get details about a specific Airflow instance
  • airflow_resolve_url: resolve instance and identifiers from an Airflow UI URL
  • airflow_list_dag_runs: confirm the recent state of DAG runs
  • airflow_get_task_instance: inspect task metadata, attempts, and optional rendered fields
  • airflow_get_task_instance_logs: inspect task failure logs with optional filtering and truncation

Environment Variables

  • AIRFLOW_MCP_INSTANCES_FILE (required): path to the registry YAML file containing Airflow instance configurations
  • AIRFLOW_MCP_DEFAULT_INSTANCE: default instance key to use if none is specified
  • AIRFLOW_MCP_HTTP_HOST: host address for the HTTP server
  • AIRFLOW_MCP_HTTP_PORT: port for the HTTP server
  • AIRFLOW_MCP_TIMEOUT_SECONDS: timeout in seconds for API requests
  • AIRFLOW_MCP_LOG_FILE: path to a log file for server output
  • AIRFLOW_MCP_HTTP_BLOCK_GET_ON_MCP: whether to block GET requests on the MCP transport

Configuration

claude_desktop_config.json:

{
  "mcpServers": {
    "airflow": {
      "command": "uv",
      "args": ["run", "airflow-mcp", "--transport", "stdio"],
      "env": { "AIRFLOW_MCP_INSTANCES_FILE": "/path/to/instances.yaml" }
    }
  }
}

Try it

List all configured Airflow instances and tell me which one is the default.
Resolve this Airflow UI URL and check the status of the most recent DAG run: [URL]
Get the logs for the failed task 'process_data' in the latest run of the 'daily_etl' DAG.
Inspect the metadata and rendered fields for the task 'load_to_db' in the specified DAG run.

Frequently Asked Questions

What are the key features of Apache Airflow MCP Server?

  • Inspect Airflow DAGs, runs, and task instances.
  • Retrieve and filter task execution logs.
  • Resolve Airflow UI URLs to internal instance identifiers.
  • Support for multiple Airflow instances via YAML configuration.
  • Gated write operations for triggering DAGs or clearing task instances.

What can I use Apache Airflow MCP Server for?

  • Quickly debugging failed data pipelines by fetching logs directly in the chat interface.
  • Monitoring the status of critical production DAGs without switching context to the Airflow UI.
  • Resolving Datadog alert URLs to specific task instances to investigate failures.
  • Managing multiple Airflow environments (staging vs. production) from a single AI assistant.

How do I install Apache Airflow MCP Server?

Install Apache Airflow MCP Server by running: uv tool install apache-airflow-mcp-server

What MCP clients work with Apache Airflow MCP Server?

Apache Airflow MCP Server works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
