mcp-server-apache-airflow
What is mcp-server-apache-airflow?
mcp-server-apache-airflow is an implementation of a Model Context Protocol (MCP) server designed for Apache Airflow. It enables seamless integration with MCP clients, providing a standardized way to interact with Apache Airflow through its REST API.
Use cases
Use cases for mcp-server-apache-airflow include automating data workflows, managing task dependencies, monitoring DAG runs, and integrating with other systems through standardized API calls. It is suitable for data engineers and developers working with Apache Airflow.
How to use
To use mcp-server-apache-airflow, set the required environment variables for your Airflow host, username, and password, then configure it in your `claude_desktop_config.json` file. After installation, you can interact with Apache Airflow through the provided API endpoints.
Key features
Key features include comprehensive DAG management, task handling, variable and connection management, monitoring capabilities, and support for datasets and XComs. The server wraps the Airflow REST API, ensuring compatibility and ease of use.
mcp-server-apache-airflow
A Model Context Protocol (MCP) server implementation for Apache Airflow, enabling seamless integration with MCP clients. This project provides a standardized way to interact with Apache Airflow through the Model Context Protocol.
About
This project implements a Model Context Protocol server that wraps Apache Airflow’s REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
Feature Implementation Status
| Feature | API Path | Status |
|---|---|---|
| DAG Management | | |
| List DAGs | /api/v1/dags | ✅ |
| Get DAG Details | /api/v1/dags/{dag_id} | ✅ |
| Pause DAG | /api/v1/dags/{dag_id} | ✅ |
| Unpause DAG | /api/v1/dags/{dag_id} | ✅ |
| Update DAG | /api/v1/dags/{dag_id} | ✅ |
| Delete DAG | /api/v1/dags/{dag_id} | ✅ |
| Get DAG Source | /api/v1/dagSources/{file_token} | ✅ |
| Patch Multiple DAGs | /api/v1/dags | ✅ |
| Reparse DAG File | /api/v1/dagSources/{file_token}/reparse | ✅ |
| DAG Runs | | |
| List DAG Runs | /api/v1/dags/{dag_id}/dagRuns | ✅ |
| Create DAG Run | /api/v1/dags/{dag_id}/dagRuns | ✅ |
| Get DAG Run Details | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id} | ✅ |
| Update DAG Run | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id} | ✅ |
| Delete DAG Run | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id} | ✅ |
| Get DAG Runs Batch | /api/v1/dags/~/dagRuns/list | ✅ |
| Clear DAG Run | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/clear | ✅ |
| Set DAG Run Note | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/setNote | ✅ |
| Get Upstream Dataset Events | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/upstreamDatasetEvents | ✅ |
| Tasks | | |
| List DAG Tasks | /api/v1/dags/{dag_id}/tasks | ✅ |
| Get Task Details | /api/v1/dags/{dag_id}/tasks/{task_id} | ✅ |
| Get Task Instance | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | ✅ |
| List Task Instances | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances | ✅ |
| Update Task Instance | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id} | ✅ |
| Clear Task Instances | /api/v1/dags/{dag_id}/clearTaskInstances | ✅ |
| Set Task Instances State | /api/v1/dags/{dag_id}/updateTaskInstancesState | ✅ |
| Variables | | |
| List Variables | /api/v1/variables | ✅ |
| Create Variable | /api/v1/variables | ✅ |
| Get Variable | /api/v1/variables/{variable_key} | ✅ |
| Update Variable | /api/v1/variables/{variable_key} | ✅ |
| Delete Variable | /api/v1/variables/{variable_key} | ✅ |
| Connections | | |
| List Connections | /api/v1/connections | ✅ |
| Create Connection | /api/v1/connections | ✅ |
| Get Connection | /api/v1/connections/{connection_id} | ✅ |
| Update Connection | /api/v1/connections/{connection_id} | ✅ |
| Delete Connection | /api/v1/connections/{connection_id} | ✅ |
| Test Connection | /api/v1/connections/test | ✅ |
| Pools | | |
| List Pools | /api/v1/pools | ✅ |
| Create Pool | /api/v1/pools | ✅ |
| Get Pool | /api/v1/pools/{pool_name} | ✅ |
| Update Pool | /api/v1/pools/{pool_name} | ✅ |
| Delete Pool | /api/v1/pools/{pool_name} | ✅ |
| XComs | | |
| List XComs | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries | ✅ |
| Get XCom Entry | /api/v1/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key} | ✅ |
| Datasets | | |
| List Datasets | /api/v1/datasets | ✅ |
| Get Dataset | /api/v1/datasets/{uri} | ✅ |
| Get Dataset Events | /api/v1/datasetEvents | ✅ |
| Create Dataset Event | /api/v1/datasetEvents | ✅ |
| Get DAG Dataset Queued Event | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri} | ✅ |
| Get DAG Dataset Queued Events | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents | ✅ |
| Delete DAG Dataset Queued Event | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents/{uri} | ✅ |
| Delete DAG Dataset Queued Events | /api/v1/dags/{dag_id}/dagRuns/queued/datasetEvents | ✅ |
| Get Dataset Queued Events | /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents | ✅ |
| Delete Dataset Queued Events | /api/v1/datasets/{uri}/dagRuns/queued/datasetEvents | ✅ |
| Monitoring | | |
| Get Health | /api/v1/health | ✅ |
| DAG Stats | | |
| Get DAG Stats | /api/v1/dags/statistics | ✅ |
| Config | | |
| Get Config | /api/v1/config | ✅ |
| Plugins | | |
| Get Plugins | /api/v1/plugins | ✅ |
| Providers | | |
| List Providers | /api/v1/providers | ✅ |
| Event Logs | | |
| List Event Logs | /api/v1/eventLogs | ✅ |
| Get Event Log | /api/v1/eventLogs/{event_log_id} | ✅ |
| System | | |
| Get Import Errors | /api/v1/importErrors | ✅ |
| Get Import Error Details | /api/v1/importErrors/{import_error_id} | ✅ |
| Get Health Status | /api/v1/health | ✅ |
| Get Version | /api/v1/version | ✅ |
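Each tool in the table is a thin wrapper over one Airflow REST endpoint. As a rough illustration only (not the server's internal code), here is a minimal stdlib-only sketch of the call behind the List DAGs tool, assuming HTTP basic auth and the default v1 API; the helper names `airflow_url` and `list_dags` are hypothetical:

```python
import base64
import json
import urllib.request


def airflow_url(host: str, path: str, version: str = "v1") -> str:
    """Join the Airflow host with a REST API path (mirrors AIRFLOW_API_VERSION)."""
    return f"{host.rstrip('/')}/api/{version}{path}"


def list_dags(host: str, username: str, password: str) -> dict:
    # GET /api/v1/dags -- the endpoint the "List DAGs" tool wraps.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req = urllib.request.Request(
        airflow_url(host, "/dags"),
        headers={"Authorization": f"Basic {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same URL-building pattern applies to the other endpoints above, substituting path parameters such as `{dag_id}` and `{dag_run_id}`.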
Setup
Dependencies
This project depends on the official Apache Airflow client library (apache-airflow-client). It will be automatically installed when you install this package.
Environment Variables
Set the following environment variables:
```sh
AIRFLOW_HOST=<your-airflow-host>
AIRFLOW_USERNAME=<your-airflow-username>
AIRFLOW_PASSWORD=<your-airflow-password>
AIRFLOW_API_VERSION=v1  # Optional, defaults to v1
```
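For illustration, a client-side helper could collect these settings like the sketch below; the function name `load_airflow_config` is hypothetical and not part of the package, but the defaulting behavior matches the documented `AIRFLOW_API_VERSION` semantics:

```python
import os


def load_airflow_config(env=os.environ) -> dict:
    """Collect the connection settings the server reads from the environment.

    AIRFLOW_HOST, AIRFLOW_USERNAME, and AIRFLOW_PASSWORD are required;
    AIRFLOW_API_VERSION is optional and defaults to "v1".
    """
    return {
        "host": env["AIRFLOW_HOST"],
        "username": env["AIRFLOW_USERNAME"],
        "password": env["AIRFLOW_PASSWORD"],
        "api_version": env.get("AIRFLOW_API_VERSION", "v1"),
    }
```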
Usage with Claude Desktop
Add to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uvx",
      "args": [
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
Alternative configuration using uv:
```json
{
  "mcpServers": {
    "mcp-server-apache-airflow": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/mcp-server-apache-airflow",
        "run",
        "mcp-server-apache-airflow"
      ],
      "env": {
        "AIRFLOW_HOST": "https://your-airflow-host",
        "AIRFLOW_USERNAME": "your-username",
        "AIRFLOW_PASSWORD": "your-password"
      }
    }
  }
}
```
Replace `/path/to/mcp-server-apache-airflow` with the actual path where you’ve cloned the repository.
Selecting the API groups
You can select the API groups you want to use by setting the `--apis` flag:

```sh
uv run mcp-server-apache-airflow --apis "dag,dagrun"
```
The default is to use all APIs.
Allowed values are:
- config
- connections
- dag
- dagrun
- dagstats
- dataset
- eventlog
- importerror
- monitoring
- plugin
- pool
- provider
- taskinstance
- variable
- xcom
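To make the flag’s behavior concrete, here is a hypothetical sketch of how a comma-separated `--apis` value could be validated against the allowed groups; the actual CLI parsing in mcp-server-apache-airflow may differ:

```python
from typing import Optional, Set

# The API groups accepted by the --apis flag, per the list above.
ALLOWED_APIS = {
    "config", "connections", "dag", "dagrun", "dagstats", "dataset",
    "eventlog", "importerror", "monitoring", "plugin", "pool",
    "provider", "taskinstance", "variable", "xcom",
}


def parse_apis(value: Optional[str]) -> Set[str]:
    """Parse a comma-separated --apis value; None or empty means all APIs."""
    if not value:
        return set(ALLOWED_APIS)
    requested = {part.strip() for part in value.split(",") if part.strip()}
    unknown = requested - ALLOWED_APIS
    if unknown:
        raise ValueError(f"unknown API group(s): {', '.join(sorted(unknown))}")
    return requested
```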
Manual Execution
You can also run the server manually:
```sh
make run
```
`make run` accepts the following options:

- `--port`: Port to listen on for SSE (default: 8000)
- `--transport`: Transport type (`stdio`/`sse`, default: `stdio`)
Alternatively, you can run the SSE server directly, which accepts the same parameters:
```sh
make run-sse
```
Installing via Smithery
To install Apache Airflow MCP Server for Claude Desktop automatically via Smithery:
```sh
npx -y @smithery/cli install @yangkyeongmo/mcp-server-apache-airflow --client claude
```
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
The package is deployed to PyPI automatically when `project.version` is updated in `pyproject.toml`.
Versioning follows semver.
Please include a version bump in your PR so that your changes are released.
License