Astro Airflow MCP Server


An MCP server for Apache Airflow that provides access to Airflow's REST API.

[!WARNING] This project has been relocated to the Astronomer agents monorepo.


Airflow MCP Server

A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.

Quickstart

IDEs

Manual configuration

Add to your MCP settings (Cursor: ~/.cursor/mcp.json, VS Code: .vscode/mcp.json):

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

CLI Tools

Claude Code
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
Gemini CLI
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
Codex CLI
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio

Desktop Apps

Claude Desktop

Add to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

Other MCP Clients

Manual JSON Configuration

Add to your MCP configuration file:

{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}

Or connect to a running HTTP server: "url": "http://localhost:8000/mcp"
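For clients that support remote servers, a URL-based entry might look like the sketch below. Exact support and key names vary by client, and the port assumes the server's default HTTP mode:

```json
{
  "mcpServers": {
    "airflow": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
```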

Note: No installation is required; uvx runs the server directly from PyPI. The --transport stdio flag is required because the server defaults to HTTP mode.

Configuration

By default, the server connects to http://localhost:8080 (Astro CLI default). Set environment variables for custom Airflow instances:

Variable             Description
AIRFLOW_API_URL      Airflow webserver URL
AIRFLOW_USERNAME     Username (Airflow 3.x uses OAuth2 token exchange)
AIRFLOW_PASSWORD     Password
AIRFLOW_AUTH_TOKEN   Bearer token (alternative to username/password)

Example with auth (Claude Code):

claude mcp add airflow -e AIRFLOW_API_URL=https://your-airflow.example.com -e AIRFLOW_USERNAME=admin -e AIRFLOW_PASSWORD=admin -- uvx astro-airflow-mcp --transport stdio
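Since AIRFLOW_AUTH_TOKEN is an alternative to username/password, a client has to pick one header style. As an illustration only (this is not the server's actual code, and auth_headers is a hypothetical helper), the choice might look like:

```python
import base64


def auth_headers(token=None, username=None, password=None):
    """Build HTTP auth headers: prefer a bearer token, else HTTP Basic.

    Hypothetical helper mirroring the precedence implied by
    AIRFLOW_AUTH_TOKEN being an alternative to
    AIRFLOW_USERNAME/AIRFLOW_PASSWORD.
    """
    if token:
        return {"Authorization": f"Bearer {token}"}
    if username and password:
        creds = base64.b64encode(f"{username}:{password}".encode()).decode()
        return {"Authorization": f"Basic {creds}"}
    return {}


print(auth_headers(token="abc123"))
print(auth_headers(username="admin", password="admin"))
```

Airflow 2.x deployments typically accept HTTP Basic credentials directly, while for 3.x the server handles the OAuth2 token exchange noted above.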

Features

  • Airflow 2.x and 3.x Support: Automatic version detection with adapter pattern
  • MCP Tools for accessing Airflow data:
    • DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
    • Task management (list, get details, get task instances, get logs)
    • Pool management (list, get details)
    • Variable management (list, get specific variables)
    • Connection management (list connections with credentials excluded)
    • Asset/Dataset management (unified naming across versions, data lineage)
    • Plugin and provider information
    • Configuration and version details
  • Consolidated Tools for agent workflows:
    • explore_dag: Get comprehensive DAG information in one call
    • diagnose_dag_run: Debug failed DAG runs with task instance details
    • get_system_health: System overview with health, errors, and warnings
  • MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
  • MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
  • Dual deployment modes: stdio for local MCP clients and HTTP for remote connections
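The version-detection-with-adapters idea above can be sketched as follows. Class names and endpoint paths are illustrative assumptions, not the server's actual internals:

```python
class Airflow2Adapter:
    """Targets the Airflow 2.x stable REST API (illustrative)."""
    dags_endpoint = "/api/v1/dags"


class Airflow3Adapter:
    """Targets the Airflow 3.x REST API (illustrative)."""
    dags_endpoint = "/api/v2/dags"


def pick_adapter(version: str):
    """Choose an adapter from the version string Airflow reports."""
    major = int(version.split(".")[0])
    return Airflow3Adapter() if major >= 3 else Airflow2Adapter()


print(pick_adapter("2.9.1").dags_endpoint)
print(pick_adapter("3.0.0").dags_endpoint)
```

The benefit of the pattern is that every tool talks to one adapter interface, so version differences stay out of the tool logic.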

Tools (3)

explore_dag: Get comprehensive DAG information in one call.
diagnose_dag_run: Debug failed DAG runs with task instance details.
get_system_health: Get a system overview with health, errors, and warnings.
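Under the MCP protocol, a client invokes one of these tools with a tools/call request. The sketch below follows the standard JSON-RPC shape; the argument name dag_id is an assumption about this server's tool schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "explore_dag",
    "arguments": { "dag_id": "daily_data_sync" }
  }
}
```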

Environment Variables

AIRFLOW_API_URL: Airflow webserver URL
AIRFLOW_USERNAME: Username for authentication
AIRFLOW_PASSWORD: Password for authentication
AIRFLOW_AUTH_TOKEN: Bearer token for authentication

Configuration

claude_desktop_config.json
{"mcpServers": {"airflow": {"command": "uvx", "args": ["astro-airflow-mcp", "--transport", "stdio"]}}}

Try it

Check the system health of my Airflow instance and report any active warnings.
Explore the details of the 'daily_data_sync' DAG and list its recent task instances.
Diagnose why the latest run of the 'etl_pipeline' DAG failed.
List all active DAGs and identify any that have failed in the last 24 hours.

Frequently Asked Questions

What are the key features of Astro Airflow?

Support for both Airflow 2.x and 3.x versions. Comprehensive DAG management including trigger, pause, and source code retrieval. Consolidated diagnostic tools for debugging failed DAG runs. Exposes static Airflow info like providers and plugins as MCP resources. Guided MCP prompts for troubleshooting and health checks.

What can I use Astro Airflow for?

Automated debugging of failed data pipelines by AI agents. Monitoring system health and infrastructure status via natural language. Quickly retrieving DAG source code and configuration details for developers. Managing workflow execution states without leaving the AI chat interface.

How do I install Astro Airflow?

No separate installation is needed: uvx fetches and runs the server directly from PyPI. Either add it to your MCP client's configuration as shown above, or start it manually with: uvx astro-airflow-mcp --transport stdio

What MCP clients work with Astro Airflow?

Astro Airflow works with any MCP-compatible client including Claude Desktop, Claude Code, Cursor, and other editors with MCP support.
