cloudflare-ai-mcp
What is cloudflare-ai-mcp?
cloudflare-ai-mcp is a Model Context Protocol (MCP) server designed for Cloudflare’s AI API, enabling advanced image generation capabilities using multiple Stable Diffusion models simultaneously. It provides a robust interface for generating images with features like AI-powered prompt generation, batch operations, and comprehensive error handling.
Use cases
Use cases for cloudflare-ai-mcp include creating promotional images for marketing campaigns, generating unique artwork for games, producing illustrations for books, and developing visual content for social media.
How to use
To use cloudflare-ai-mcp, you need to set up a Cloudflare account and obtain an API token. Install the server via manual configuration or automatic installation using Smithery. Configure the MCP client settings and run the server using Python. You can generate images by calling the generate_images function with a prompt and size, or create prompts using the generate_prompt function.
Key features
Key features of cloudflare-ai-mcp include support for multiple Stable Diffusion models, concurrent image generation, AI-powered prompt generation, comprehensive error handling, and batch operations for generating multiple images in parallel.
Where to use
cloudflare-ai-mcp can be used in various fields including digital art, marketing, game development, and any application requiring high-quality image generation or manipulation.
Content
AI Image Generation MCP Server
MCP Server for the Cloudflare AI API, enabling image generation operations with multiple models and prompt generation capabilities.
Overview
This MCP server provides tools to generate images using Cloudflare’s AI models while preserving quality and supporting multiple model types.
Features
- Multiple Model Support: Supports various Stable Diffusion models from Cloudflare AI
- Concurrent Generation: Generate images with multiple models simultaneously
- Prompt Generation: AI-powered prompt generation for better results
- Comprehensive Error Handling: Clear error messages for common issues
- Batch Operations: Support for generating multiple images in parallel
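The concurrent-generation feature described above can be sketched with `asyncio.gather`. The model IDs and the `run_model` helper below are illustrative stand-ins, not the server's actual internals:

```python
import asyncio

# Illustrative model IDs; the actual Cloudflare model list may differ.
MODELS = [
    "@cf/stabilityai/stable-diffusion-xl-base-1.0",
    "@cf/lykon/dreamshaper-8-lcm",
]

async def run_model(model_id: str, prompt: str) -> bytes:
    """Stand-in for a single Cloudflare AI request; returns placeholder bytes."""
    await asyncio.sleep(0)  # placeholder for the real HTTP call
    return f"image-from-{model_id}".encode()

async def generate_with_all_models(prompt: str) -> dict[str, bytes]:
    """Fan one prompt out to every model concurrently; collect results by model ID."""
    results = await asyncio.gather(*(run_model(m, prompt) for m in MODELS))
    return dict(zip(MODELS, results))
```

The key point is that all model requests are in flight at once rather than sequential, which is what makes multi-model batch generation practical.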
Tools
- generate_images - Generate images using multiple models simultaneously
  - Inputs:
    - prompt (string): Image generation prompt
    - size_id (string): Image size (e.g., "1024x1024")
  - Returns: Dictionary of model IDs to generated images
- generate_prompt - Generate detailed image prompts using AI
  - Inputs:
    - theme (string): Basic theme to expand
  - Returns: Detailed generation prompt
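The `size_id` input implies a parsing/validation step before any API call. A hedged sketch; the regex and error message are illustrative, not the server's actual checks:

```python
import re

def validate_size_id(size_id: str) -> tuple[int, int]:
    """Parse a size string like '1024x1024' into a (width, height) tuple."""
    match = re.fullmatch(r"(\d+)x(\d+)", size_id)
    if not match:
        raise ValueError(f"size_id must look like '1024x1024', got {size_id!r}")
    return int(match.group(1)), int(match.group(2))
```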
Setup
Prerequisites
- Python 3.10 or higher
- Sufficient disk space (1GB+ recommended)
- Stable network connection
- Cloudflare account and API token
Cloudflare API Token
Create a Cloudflare API Token with appropriate permissions:
- Log in to Cloudflare Dashboard
- Get Account ID:
  - Click account icon in top right
  - Select "Account Home"
  - Find "Account ID" in overview
- Create API Token:
  - Go to "My Profile"
  - Select "API Tokens"
  - Click "Create Token"
  - Choose "Create Custom Token"
  - Add permissions:
    - Account.AI - Read & Edit
    - Account.Workers AI - Read & Edit
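With the Account ID and token in hand, you can sanity-check them against Cloudflare's REST endpoint (`/accounts/{account_id}/ai/run/{model}`). A standard-library-only sketch; the model name is just an example:

```python
import json
import urllib.request

API_BASE = "https://api.cloudflare.com/client/v4/accounts"

def build_request(account_id: str, token: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an authenticated POST to Cloudflare's AI run endpoint."""
    url = f"{API_BASE}/{account_id}/ai/run/{model}"
    body = json.dumps({"prompt": prompt}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually send it: urllib.request.urlopen(build_request(...))
```

A 200 response confirms the Account ID, token, and permissions are all correct; a 401/403 points at the token steps above.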
Installation
Option 1: Manual Installation via Configuration File
Add to your Claude Desktop config file:
- MacOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%/Claude/claude_desktop_config.json
{
"mcpServers": {
"CloudflareAI": {
"command": "uvx",
"args": [
"cloudflare-ai"
]
}
}
}
Option 2: Automatic Installation via Smithery
npx -y @smithery/cli install cloudflare-ai --client claude
Development
Building and Publishing
# Sync dependencies
uv sync
# Build package
uv build
# Publish to PyPI
uv publish
Debugging
For the best debugging experience, use the MCP Inspector:
npx @modelcontextprotocol/inspector python serv.py
Project Structure
cloudflare-ai/
├── src/
│   └── cloudflare_ai/
│       ├── __init__.py
│       ├── server.py
│       ├── tools/
│       │   ├── __init__.py
│       │   ├── generate_images.py
│       │   └── generate_prompt.py
│       └── utils/
│           ├── __init__.py
│           ├── cloudflare.py
│           └── logging.py
├── demo/
│   └── examples.py
├── tests/
│   └── test_tools.py
├── pyproject.toml
├── uv.lock
├── README.md
└── README_CN.md
Configuration
Local Development
- Copy example config:
cp .env.example .env
- Edit configuration:
CLOUDFLARE_ACCOUNT_ID=your_account_id
CLOUDFLARE_API_TOKEN=your_api_token
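Loading the .env file reduces to parsing KEY=value lines. A minimal sketch; in practice a library such as python-dotenv handles edge cases like quoting:

```python
import os

def load_env(text: str) -> dict[str, str]:
    """Parse KEY=value lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

def apply_env(text: str) -> None:
    """Load parsed values into os.environ without clobbering existing settings."""
    for key, value in load_env(text).items():
        os.environ.setdefault(key, value)
```

Using `setdefault` means real environment variables (e.g. set in production) always win over .env defaults.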
Production Environment
For production, use environment variables:
# Linux/Mac
export CLOUDFLARE_ACCOUNT_ID="your_account_id"
export CLOUDFLARE_API_TOKEN="your_api_token"
# Windows
set CLOUDFLARE_ACCOUNT_ID=your_account_id
set CLOUDFLARE_API_TOKEN=your_api_token
Running the Server
MCP Client Configuration
Add the following configuration to your MCP client settings:
{
"mcpServers": {
"CloudflareAI": {
"command": "python",
"args": [
"serv.py"
],
"env": {
"CLOUDFLARE_ACCOUNT_ID": "your_account_id",
"CLOUDFLARE_API_TOKEN": "your_api_token"
},
"cwd": "/path/to/server/directory"
}
}
}
Configuration options:
- command: Python executable path
- args: Server script and arguments
- env: Environment variables for authentication
- cwd: Working directory for the server
You can also run the server on a custom port:
{
"mcpServers": {
"CloudflareAI": {
"command": "python",
"args": [
"serv.py",
"--port",
"8080"
],
"env": {
"CLOUDFLARE_ACCOUNT_ID": "your_account_id",
"CLOUDFLARE_API_TOKEN": "your_api_token"
}
}
}
}
Development Mode
# Run server in test mode
python serv.py
Production Deployment
Using uvicorn or gunicorn:
- Install a server:
pip install uvicorn gunicorn
- Run with uvicorn:
uvicorn serv:app --host 0.0.0.0 --port 8000 --workers 4
- Or with gunicorn (Linux/Mac only):
gunicorn serv:app -w 4 -k uvicorn.workers.UvicornWorker -b 0.0.0.0:8000
Monitoring & Logging
Log Configuration
Configure logging in Python (e.g., at server startup):
import logging
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
handlers=[
logging.FileHandler('server.log'),
logging.StreamHandler()
]
)
Performance Monitoring
Recommended tools:
- Prometheus + Grafana
- New Relic
- Datadog
Security Recommendations
- Use HTTPS
- Implement rate limiting
- Keep dependencies updated
- Never hardcode credentials
- Set proper file permissions
Troubleshooting
Common Issues
- API Credential Errors:
  - Verify Account ID and API Token
  - Check token permissions
- Network Issues:
  - Check firewall settings
  - Verify network connectivity
- Memory Issues:
  - Adjust worker count
  - Monitor memory usage
Debug Mode
# Run with debug mode enabled
DEBUG=1 python serv.py
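The DEBUG flag maps naturally onto Python's logging levels. A sketch; the variable name follows the command above, but the server's real handling may differ:

```python
import logging
import os

def configure_logging(environ=os.environ) -> int:
    """Use DEBUG-level logging when DEBUG=1 is set, INFO otherwise."""
    level = logging.DEBUG if environ.get("DEBUG") == "1" else logging.INFO
    logging.basicConfig(level=level)
    return level
```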
Maintenance
- Update dependencies:
pip install --upgrade -r requirements.txt
- Monitor Cloudflare API changes
- Monitor disk usage
- Backup configuration files regularly
Support
For assistance:
- Check documentation
- Submit GitHub Issue
- Contact technical support
License
This MCP server is licensed under the MIT License.