langgraph-mcp-pipeline
What is langgraph-mcp-pipeline?
langgraph-mcp-pipeline is a demonstration project that utilizes the Model Context Protocol (MCP) in conjunction with LangGraph to create workflows for generating prompts and AI-generated images based on specific topics, incorporating Human-in-the-Loop interaction.
Use cases
Use cases for langgraph-mcp-pipeline include generating artwork for storytelling, creating educational visuals, assisting in marketing campaigns with AI-generated images, and developing interactive applications that require user feedback for content generation.
How to use
To use langgraph-mcp-pipeline, run the scripts provided in the project, primarily app.py and graph.py. These scripts generate a prompt and an image for a given topic, collecting user feedback through a defined workflow.
Key features
Key features of langgraph-mcp-pipeline include the integration of Human-in-the-Loop interaction, the use of LangGraph Functional and Graph APIs, and the ability to generate AI image prompts and images based on user-defined topics.
Where to use
langgraph-mcp-pipeline can be used in various fields such as creative content generation, digital art, education, and any domain where AI-generated imagery and prompts can enhance user experience and engagement.
AI Image Generation Pipeline with LangGraph and MCP
This project demonstrates the use of the Model Context Protocol (MCP) with LangGraph to create workflows that generate prompts and AI-generated images based on a given topic. The project consists of three main files: app.py, graph.py, and ai-image-gen-pipeline.py. Each file showcases different aspects of using MCP with LangGraph, including the LangGraph Functional API, Graph API, and integration within Open WebUI Pipelines. These scripts utilize the Comfy MCP Server to generate AI image prompts and AI images.
Files
app.py
This script demonstrates the use of the LangGraph Functional API along with Human-in-the-Loop (HIL) interaction to generate prompts and AI-generated images based on a given topic. The workflow includes user feedback to approve generated prompts before generating the corresponding image.
Key Components:
- Dependencies: aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli]
- Functions:
  - run_tool(tool: str, args: dict) -> str: Runs a tool using the MCP server.
  - generate_prompt(topic: str) -> str: Generates a prompt for a given topic.
  - generate_image(prompt: str) -> str: Generates an image based on a given prompt.
  - get_feedback(topic: str, prompt: str) -> str: Collects user feedback on the generated prompt.
  - workflow_func(saver): Defines the workflow function with checkpointing.
- Main Function:
- Parses command-line arguments to get the thread ID and, optionally, the topic and feedback.
- Initializes the workflow and runs it, based on the provided input.
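The approve-or-regenerate loop that app.py builds with the LangGraph Functional API can be sketched in plain Python. This is a minimal, dependency-free sketch of the control flow only: the stub bodies of generate_prompt and generate_image are assumptions standing in for the Comfy MCP Server tool calls, and the real script adds SQLite checkpointing so the run can pause for feedback and resume later.

```python
# Dependency-free sketch of the app.py control flow. The real script
# uses the LangGraph Functional API with SQLite checkpointing and calls
# the Comfy MCP Server; the stubs below are hypothetical stand-ins.

def generate_prompt(topic: str) -> str:
    # Stand-in for the MCP prompt-generation tool call.
    return f"A detailed illustration of {topic}"

def generate_image(prompt: str) -> str:
    # Stand-in for the MCP image-generation tool call; the real tool
    # returns an image produced by ComfyUI.
    return f"<image for: {prompt}>"

def run_workflow(topic: str, get_feedback) -> str:
    """Regenerate the prompt until the user approves it, then make the image."""
    while True:
        prompt = generate_prompt(topic)
        if get_feedback(topic, prompt).lower().startswith("y"):
            return generate_image(prompt)
        # On "n", loop back and generate a fresh prompt.

if __name__ == "__main__":
    # Auto-approve for demonstration; app.py collects feedback interactively.
    print(run_workflow("a lighthouse at dawn", lambda t, p: "y"))
```

In the real script the pause at the feedback step is an interrupt persisted by the checkpointer, which is why the CLI takes a thread ID: a later invocation with the same thread ID resumes the paused run.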
graph.py
This script demonstrates the use of the LangGraph Graph API along with Human-in-the-Loop (HIL) interaction to generate prompts and AI-generated images based on a given topic. The workflow includes user feedback to approve generated prompts before generating the corresponding image.
Key Components:
- Dependencies: aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli]
- Functions:
  - run_tool(tool: str, args: dict) -> str: Runs a tool using the MCP server.
  - generate_prompt(state: State) -> State: Generates a prompt for a given topic and updates the state.
  - generate_image(state: State) -> State: Generates an image based on a given prompt and updates the state.
  - prompt_feedback(state: State) -> State: Collects user feedback on the generated prompt.
  - process_feedback(state: State) -> str: Processes the user feedback to determine the next step in the workflow.
- Main Function:
- Parses command-line arguments to get the thread ID, topic, and feedback.
- Initializes the state graph and runs it based on the provided input.
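The routing described above, where process_feedback decides the next node, can be sketched without langgraph installed. In the real script this is a StateGraph with a conditional edge; here the graph traversal is hand-rolled, State is simplified to a plain dict, and the node bodies are hypothetical stand-ins for the MCP tool calls.

```python
# Dependency-free sketch of the graph.py routing logic. Node names
# mirror the functions described above; the bodies are assumptions.

State = dict  # the real State is a TypedDict carrying topic/prompt/feedback

def process_feedback(state: State) -> str:
    # Conditional edge: "y" approves the prompt and routes to image
    # generation; anything else routes back to regenerate the prompt.
    return "generate_image" if state.get("feedback", "").lower() == "y" else "generate_prompt"

def run_graph(state: State) -> State:
    # Hand-rolled equivalent of invoking the compiled graph: follow
    # edges from node to node until the image node returns.
    node = "generate_prompt"
    while True:
        if node == "generate_prompt":
            state["prompt"] = f"A detailed illustration of {state['topic']}"
            node = "prompt_feedback"
        elif node == "prompt_feedback":
            # graph.py interrupts here and waits for --feedback "y/n";
            # this sketch reads a pre-seeded value instead.
            node = process_feedback(state)
        elif node == "generate_image":
            state["image"] = f"<image for: {state['prompt']}>"
            return state

# Pre-seed approval so the sketch terminates in one pass.
final = run_graph({"topic": "a red fox", "feedback": "y"})
```

Splitting the decision out into process_feedback is what lets the Graph API express the HIL loop declaratively: the conditional edge maps its return value to the next node.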
ai-image-gen-pipeline.py
This script demonstrates the integration of the LangGraph API with Human-in-the-Loop (HIL) within Open WebUI Pipelines. It defines a pipeline for generating prompts and images using MCP, including nodes for generating prompts, processing feedback, and generating images.
Key Components:
- Dependencies: aiosqlite, langgraph, langgraph-checkpoint-sqlite, mcp[cli]
- Classes:
  - Pipeline: Defines the pipeline with nodes for generating prompts, processing feedback, and generating images.
  - Valves(BaseModel): Contains environment variables for MCP server configuration.
- Functions:
  - inlet(body: dict, user: dict) -> dict: Processes incoming messages.
  - outlet(body: dict, user: dict) -> dict: Processes outgoing messages.
  - pipe(user_message: str, model_id: str, messages: List[dict], body: dict) -> Union[str, Generator, Iterator]: Defines the main pipeline logic.
  - run_tool(tool: str, args: dict) -> str: Runs a tool using the MCP server.
  - generate_prompt(state: State) -> State: Generates a prompt for a given topic and updates the state.
  - generate_image(state: State) -> State: Generates an image based on a given prompt and updates the state.
  - prompt_feedback(state: State) -> State: Collects user feedback on the generated prompt.
  - process_feedback(state: State) -> str: Processes the user feedback to determine the next step in the workflow.
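The Pipeline class follows the inlet/pipe/outlet shape that Open WebUI Pipelines calls into. The skeleton below shows that shape with the signatures listed above; the method bodies and the simplified Valves class (plain Python instead of a pydantic BaseModel, to keep the sketch dependency-free) are placeholder assumptions, not the project's actual implementation.

```python
# Skeleton of the Open WebUI Pipeline interface described above.
# Bodies are illustrative placeholders; the real pipe() drives the
# LangGraph workflow via the Comfy MCP Server.
from typing import Generator, Iterator, List, Union

class Valves:
    """Stand-in for the pydantic Valves(BaseModel) holding MCP config."""
    def __init__(self, comfy_url: str = "comfy-url"):
        self.COMFY_URL = comfy_url

class Pipeline:
    def __init__(self):
        self.valves = Valves()

    def inlet(self, body: dict, user: dict) -> dict:
        # Pre-process incoming messages, e.g. tag them with the user id.
        body.setdefault("user_id", user.get("id"))
        return body

    def pipe(self, user_message: str, model_id: str,
             messages: List[dict], body: dict) -> Union[str, Generator, Iterator]:
        # Main pipeline logic; this placeholder just echoes a prompt.
        return f"prompt for: {user_message}"

    def outlet(self, body: dict, user: dict) -> dict:
        # Post-process outgoing messages before they reach the UI.
        return body
```

Open WebUI calls inlet before the model turn, pipe for the turn itself, and outlet on the way back out, which is where the HIL feedback step can intercept the conversation.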
Usage
1. Install Dependencies: Ensure you have the required dependencies installed.

```
pip install aiosqlite langgraph langgraph-checkpoint-sqlite mcp[cli] comfy-mcp-server
```

2. Run the Application:

- For app.py:

```
python app.py --topic "Your topic here"
```

- For graph.py:

```
python graph.py --thread_id "your-thread-id" --topic "Your topic here"
```

For feedback:

```
python graph.py --thread_id "your-thread-id" --feedback "y/n"
```

3. Using the uv Utility: You can also launch app.py and graph.py using the uv utility, which manages the Python version and dependencies, so there is no need to preinstall them.

- For app.py:

```
uv run app.py --topic "Your topic here"
```

- For graph.py:

```
uv run graph.py --thread_id "your-thread-id" --topic "Your topic here"
```

For feedback:

```
uv run graph.py --thread_id "your-thread-id" --feedback "y/n"
```

4. Environment Variables: Set the necessary environment variables for MCP server configuration.

```
export COMFY_URL="comfy-url"
export COMFY_URL_EXTERNAL="comfy-url-external"
export COMFY_WORKFLOW_JSON_FILE="path-to-workflow-json-file"
export PROMPT_NODE_ID="prompt-node-id"
export OUTPUT_NODE_ID="output-node-id"
export OLLAMA_API_BASE="ollama-api-base"
export PROMPT_LLM="prompt-llm"
```
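Since a missing variable only surfaces once the workflow reaches the MCP server, it can help to validate the configuration at startup. This is a minimal sketch, assuming the variable names listed above; load_config and its error message are not part of the project.

```python
# Minimal sketch: collect and validate the Comfy MCP Server settings
# listed above from the environment before starting a workflow.
import os

REQUIRED_VARS = [
    "COMFY_URL",
    "COMFY_URL_EXTERNAL",
    "COMFY_WORKFLOW_JSON_FILE",
    "PROMPT_NODE_ID",
    "OUTPUT_NODE_ID",
    "OLLAMA_API_BASE",
    "PROMPT_LLM",
]

def load_config(env=None) -> dict:
    """Return the settings as a dict, raising if any variable is unset."""
    env = os.environ if env is None else env
    missing = [name for name in REQUIRED_VARS if not env.get(name)]
    if missing:
        raise RuntimeError(f"Missing MCP server settings: {', '.join(missing)}")
    return {name: env[name] for name in REQUIRED_VARS}
```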
Contributing
Feel free to contribute to this project by submitting pull requests or issues. Ensure that any changes are well-documented and tested.
License
This project is licensed under the MIT License.