Comfy MCP Server
What is Comfy MCP Server
Comfy MCP Server is a server that utilizes the FastMCP framework to generate images based on user-defined prompts by interacting with a remote Comfy server.
Use cases
Use cases for Comfy MCP Server include generating artistic images based on textual descriptions, automating image creation for content generation, and facilitating creative projects that require visual outputs from textual inputs.
How to use
To use Comfy MCP Server, set the required environment variables (COMFY_URL, COMFY_WORKFLOW_JSON_FILE, PROMPT_NODE_ID, and OUTPUT_NODE_ID), then launch the server with `uvx comfy-mcp-server`.
Key features
Key features include image generation from prompts, integration with Comfy server workflows, customizable output modes (URL or file), and the ability to connect with an Ollama server for enhanced prompt generation.
Comfy MCP Server
A server using FastMCP framework to generate images based on prompts via a remote Comfy server.
Overview
This script sets up a server using the FastMCP framework to generate images based on prompts using a specified workflow. It interacts with a remote Comfy server to submit prompts and retrieve generated images.
Prerequisites
- `uv` package and project manager for Python.
- Workflow file exported from Comfy UI. This code includes a sample `Flux-Dev-ComfyUI-Workflow.json`, which is used here only as a reference. You will need to export your own workflow and set the environment variables accordingly.
You can install the required packages for local development:

```shell
uvx mcp[cli]
```
Configuration
Set the following environment variables:
- `COMFY_URL` to point to your Comfy server URL.
- `COMFY_WORKFLOW_JSON_FILE` to point to the absolute path of the API export JSON file for the ComfyUI workflow.
- `PROMPT_NODE_ID` to the id of the text prompt node.
- `OUTPUT_NODE_ID` to the id of the output node with the final image.
- `OUTPUT_MODE` to either `url` or `file` to select the desired output.
Optionally, if you have an Ollama server running, you can connect to it for prompt generation.
- `OLLAMA_API_BASE` to the URL where Ollama is running.
- `PROMPT_LLM` to the name of the model hosted on Ollama for prompt generation.
Example:

```shell
export COMFY_URL=http://your-comfy-server-url:port
export COMFY_WORKFLOW_JSON_FILE=/path/to/the/comfyui_workflow_export.json
export PROMPT_NODE_ID=6 # use the correct node id here
export OUTPUT_NODE_ID=9 # use the correct node id here
export OUTPUT_MODE=file
```
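If you are using the optional Ollama integration, the corresponding variables can be set the same way. The base URL below is Ollama's default local endpoint and the model name is only an example; substitute whatever model you have pulled:

```shell
export OLLAMA_API_BASE=http://localhost:11434  # default Ollama endpoint; adjust if yours differs
export PROMPT_LLM=llama3.2                     # example model name; use any model available on your Ollama server
```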
Usage
Comfy MCP Server can be launched with the following command:

```shell
uvx comfy-mcp-server
```
Example Claude Desktop Config
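A minimal sketch of a `claude_desktop_config.json` entry for this server, assuming the standard Claude Desktop MCP config format; the server name and all `env` values below are placeholders to replace with your own:

```json
{
  "mcpServers": {
    "comfy-mcp-server": {
      "command": "uvx",
      "args": ["comfy-mcp-server"],
      "env": {
        "COMFY_URL": "http://your-comfy-server-url:port",
        "COMFY_WORKFLOW_JSON_FILE": "/path/to/the/comfyui_workflow_export.json",
        "PROMPT_NODE_ID": "6",
        "OUTPUT_NODE_ID": "9",
        "OUTPUT_MODE": "file"
      }
    }
  }
}
```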
Functionality
`generate_image(prompt: str, ctx: Context) -> Image | str`
This function generates an image using a specified prompt. It follows these steps:
- Checks if all the environment variables are set.
- Loads a prompt template from a JSON file.
- Submits the prompt to the Comfy server.
- Polls the server for the status of the prompt processing.
- Retrieves and returns the generated image once it’s ready.
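The submit-and-poll flow above can be sketched roughly as follows. The endpoint paths (`POST /prompt`, `GET /history/<id>`) are part of ComfyUI's public HTTP API, but the `inject_prompt` helper, the `"text"` input field, and the node ids depend on your exported workflow, so treat this as an illustration under those assumptions rather than the server's exact implementation:

```python
import json
import time
import urllib.request


def inject_prompt(workflow: dict, prompt_node_id: str, prompt: str) -> dict:
    """Return a copy of the workflow with the prompt text placed into the prompt node."""
    wf = json.loads(json.dumps(workflow))  # deep copy via JSON round-trip
    wf[prompt_node_id]["inputs"]["text"] = prompt  # field name assumes a standard text prompt node
    return wf


def submit_and_poll(comfy_url: str, workflow: dict, prompt_node_id: str, prompt: str) -> dict:
    """Submit the workflow to the Comfy server, then poll until the result is ready."""
    payload = json.dumps({"prompt": inject_prompt(workflow, prompt_node_id, prompt)}).encode()
    req = urllib.request.Request(
        f"{comfy_url}/prompt", data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        prompt_id = json.load(resp)["prompt_id"]
    while True:
        with urllib.request.urlopen(f"{comfy_url}/history/{prompt_id}") as resp:
            history = json.load(resp)
        if prompt_id in history:  # the entry appears once processing has finished
            return history[prompt_id]
        time.sleep(1)  # delay between polls to avoid hammering the server
```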
`generate_prompt(topic: str, ctx: Context) -> str`
This function generates a comprehensive image-generation prompt from a specified topic.
Dependencies
- `mcp`: For setting up the FastMCP server.
- `json`: For handling JSON data.
- `urllib`: For making HTTP requests.
- `time`: For adding delays in polling.
- `os`: For accessing environment variables.
- `langchain`: For creating a simple LLM prompt chain to generate an image-generation prompt from a topic.
- `langchain-ollama`: For Ollama-specific modules for LangChain.
License
This project is licensed under the MIT License - see the LICENSE file for details.