> [!WARNING]
> This project has been relocated to the Astronomer agents monorepo.
# Airflow MCP Server
A Model Context Protocol (MCP) server for Apache Airflow that provides AI assistants with access to Airflow's REST API. Built with FastMCP.
## Quickstart
### IDEs

#### Manual configuration
Add to your MCP settings (Cursor: `~/.cursor/mcp.json`, VS Code: `.vscode/mcp.json`):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
### CLI Tools

#### Claude Code

```shell
claude mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
#### Gemini CLI

```shell
gemini mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
#### Codex CLI

```shell
codex mcp add airflow -- uvx astro-airflow-mcp --transport stdio
```
### Desktop Apps

#### Claude Desktop

Add to `~/Library/Application Support/Claude/claude_desktop_config.json` (macOS) or `%APPDATA%\Claude\claude_desktop_config.json` (Windows):
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
### Other MCP Clients

#### Manual JSON Configuration

Add to your MCP configuration file:
```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"]
    }
  }
}
```
Or connect to a running HTTP server: `"url": "http://localhost:8000/mcp"`
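For clients that support URL-based servers, the same entry with the `"url"` form (assuming the server is already running in HTTP mode on its default address) would look like:

```json
{
  "mcpServers": {
    "airflow": {
      "url": "http://localhost:8000/mcp"
    }
  }
}
```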
> [!NOTE]
> No installation is required: `uvx` runs the package directly from PyPI. The `--transport stdio` flag is required because the server defaults to HTTP mode.
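As a sketch of the two transports, assuming the defaults described above (HTTP served at `http://localhost:8000/mcp`):

```shell
# HTTP mode (the default): run the server standalone and point clients at its URL
uvx astro-airflow-mcp

# stdio mode: for clients that spawn the server as a subprocess
uvx astro-airflow-mcp --transport stdio
```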
## Configuration
By default, the server connects to `http://localhost:8080` (the Astro CLI default). Set environment variables to point at a custom Airflow instance:
| Variable | Description |
|---|---|
| `AIRFLOW_API_URL` | Airflow webserver URL |
| `AIRFLOW_USERNAME` | Username (Airflow 3.x uses OAuth2 token exchange) |
| `AIRFLOW_PASSWORD` | Password |
| `AIRFLOW_AUTH_TOKEN` | Bearer token (alternative to username/password) |
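Many MCP clients also accept an `env` block per server in the JSON configuration; assuming your client supports it, a sketch with placeholder values:

```json
{
  "mcpServers": {
    "airflow": {
      "command": "uvx",
      "args": ["astro-airflow-mcp", "--transport", "stdio"],
      "env": {
        "AIRFLOW_API_URL": "https://your-airflow.example.com",
        "AIRFLOW_USERNAME": "admin",
        "AIRFLOW_PASSWORD": "admin"
      }
    }
  }
}
```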
Example with auth (Claude Code):

```shell
claude mcp add airflow \
  -e AIRFLOW_API_URL=https://your-airflow.example.com \
  -e AIRFLOW_USERNAME=admin \
  -e AIRFLOW_PASSWORD=admin \
  -- uvx astro-airflow-mcp --transport stdio
```
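The bearer-token alternative follows the same pattern; a sketch with a placeholder token value:

```shell
claude mcp add airflow \
  -e AIRFLOW_API_URL=https://your-airflow.example.com \
  -e AIRFLOW_AUTH_TOKEN=your-token \
  -- uvx astro-airflow-mcp --transport stdio
```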
## Features
- Airflow 2.x and 3.x support: automatic version detection with an adapter pattern
- MCP Tools for accessing Airflow data:
  - DAG management (list, get details, get source code, stats, warnings, import errors, trigger, pause/unpause)
  - Task management (list, get details, get task instances, get logs)
  - Pool management (list, get details)
  - Variable management (list, get specific variables)
  - Connection management (list connections with credentials excluded)
  - Asset/Dataset management (unified naming across versions, data lineage)
  - Plugin and provider information
  - Configuration and version details
- Consolidated Tools for agent workflows:
  - `explore_dag`: Get comprehensive DAG information in one call
  - `diagnose_dag_run`: Debug failed DAG runs with task instance details
  - `get_system_health`: System overview with health, errors, and warnings
- MCP Resources: Static Airflow info exposed as resources (version, providers, plugins, config)
- MCP Prompts: Guided workflows for common tasks (troubleshooting, health checks, onboarding)
- Dual deployment modes: stdio (for clients that spawn the server as a subprocess) and HTTP (the server default)