mcp-airflow-openwebui
What is mcp-airflow-openwebui?
mcp-airflow-openwebui is a Model Context Protocol (MCP) server that integrates Apache Airflow with an Open Web UI, enabling standardized access to DAG metadata, run status, and task insights.
Use cases
Use cases include monitoring DAG execution status, automating task management, and integrating with other MCP clients for enhanced data processing workflows.
How to use
To use mcp-airflow-openwebui, start the Airflow environment with Docker Compose, open the Airflow web interface, then set up a virtual environment, install dependencies, and run the MCP server.
Key features
Key features include standardized interaction with Apache Airflow’s REST API, seamless integration with MCP clients, and the ability to monitor and automate workflows effectively.
Where to use
mcp-airflow-openwebui can be used in data engineering, workflow automation, and any domain requiring orchestration of complex data pipelines.
Content
AirTrack
A Model Context Protocol (MCP) server for Apache Airflow that enables standardized access to DAG metadata, run status, and task insights, allowing seamless integration with MCP clients for monitoring and automation.
About
This project implements a Model Context Protocol server that wraps Apache Airflow’s REST API, allowing MCP clients to interact with Airflow in a standardized way. It uses the official Apache Airflow client library to ensure compatibility and maintainability.
Project Structure
combined_project/
├── airflow/                    # Airflow project files
│   ├── dags/                   # Airflow DAG definitions
│   ├── logs/                   # Airflow logs
│   ├── plugins/                # Airflow plugins
│   └── docker-compose.yaml     # Docker Compose file for Airflow
└── mpc/                        # MCP application files
    ├── utils/                  # Utility functions
    ├── server.py               # Main server file
    └── main.py                 # Entry point
Running the Projects
Requirements
- Docker and Docker Compose for Airflow
- Python 3.8+ for the MCP application
- A virtual environment for the MCP application
Airflow
1. Navigate to the airflow directory:

   cd airflow

2. Start Airflow using Docker Compose:

   docker-compose up

3. Access the Airflow web interface at http://localhost:8181

   Username: admin
   Password: airflow
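Once the containers are up, you can confirm the webserver is reachable without logging in. The sketch below (hypothetical helper names, port from the step above) uses Airflow's standard unauthenticated /health endpoint:

```python
import json
import urllib.request

def airflow_is_healthy(payload: dict) -> bool:
    # /health reports per-component status, e.g.
    # {"metadatabase": {"status": "healthy"}, "scheduler": {"status": "healthy", ...}}
    return all(
        component.get("status") == "healthy"
        for component in payload.values()
        if isinstance(component, dict)
    )

def check_airflow(base_url: str = "http://localhost:8181") -> bool:
    # The /health endpoint requires no authentication
    with urllib.request.urlopen(f"{base_url}/health") as resp:
        return airflow_is_healthy(json.load(resp))
```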
MCP Application

1. Navigate to the mpc directory:

   cd mpc

2. Create and activate a virtual environment:

   python -m venv .venv
   .venv\Scripts\activate      # On Windows
   source .venv/bin/activate   # On Unix/MacOS

3. Install dependencies:

   pip install -r requirements.txt

4. Run the MCP server:

   python server.py
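The contents of server.py are not reproduced in this README. As a rough, hypothetical illustration, an MCP tool for run-status monitoring might wrap a small helper like the one below; the response shape follows Airflow's GET /dags/{dag_id}/dagRuns endpoint, and the function name is an assumption:

```python
def summarize_dag_runs(payload: dict) -> dict:
    # Collapse a /dags/{dag_id}/dagRuns response into counts per run state,
    # e.g. {"success": 12, "failed": 1, "running": 2}
    counts = {}
    for run in payload.get("dag_runs", []):
        state = run.get("state", "unknown")
        counts[state] = counts.get(state, 0) + 1
    return counts
```

Exposed as an MCP tool, a summary like this lets a client ask "how is my DAG doing?" without parsing raw Airflow responses.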
Usage with Claude Desktop
Add an entry like the following to your Claude Desktop configuration file (typically claude_desktop_config.json), adjusting the uv executable path and the server file path for your machine:
{
"mcpServers": {
"FlowPredictor": {
"command": "D:\\Apps\\conda\\Scripts\\uv.EXE",
"args": [
"run",
"--with",
"mcp[cli]",
"mcp",
"run",
"<---PATH OF YOUR SERVER FILE eg(C:\\Users\\..\\..\\..\\server.py) --->"
]
}
}
}
Integration
The Airflow DAGs can interact with the MCP application through API calls. Make sure both services are running when executing workflows that require MCP functionality.
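As a sketch of what such an API call from a DAG task might look like (the endpoint path and port here are hypothetical, and the helpers would typically be wrapped in a PythonOperator or @task callable):

```python
import json
import urllib.request

MCP_APP_URL = "http://localhost:8001"  # hypothetical address where the MCP app listens

def build_request(endpoint: str, body: dict) -> urllib.request.Request:
    # Build a JSON POST that an Airflow task could send to the MCP application
    return urllib.request.Request(
        f"{MCP_APP_URL}/{endpoint.lstrip('/')}",
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def notify_mcp(endpoint: str, body: dict) -> dict:
    # Fire the request and return the decoded JSON response
    with urllib.request.urlopen(build_request(endpoint, body)) as resp:
        return json.load(resp)
```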
Future Development
- 🔄 Live Updates – Stream DAG/task status via WebSocket or SSE.
- 🔐 Security – Add OAuth2, API keys, and role-based access.
- ⚡ Event Triggers – Auto-trigger agents on DAG events.
- 📊 Analytics – Dashboard for DAG performance and trends.
- 🤖 AI Troubleshooting – Use LLMs for issue analysis and fixes.
Integrate with OpenWebUI

1. Install MCPO:

   pip install mcpo

2. Create config.json in the mcp folder:
{
"mcpServers": {
"airflow-mcp-server": {
"command": "C:\\Users\\RakeshReddyBijjam\\pipx\\venvs\\meltano\\Scripts\\uv.EXE",
"args": [
"run",
"--with",
"mcp[cli]",
"mcp",
"run",
"C:\\Users\\RakeshReddyBijjam\\Desktop\\claude_sam\\AirTrack\\mcp\\server.py"
]
}
}
}
3. Run the server:
uvx mcpo --config config.json --port 8001