Mcpo Heroku
What is Mcpo Heroku?
mcpo-heroku is a proxy tool that exposes any MCP server command as an OpenAPI-compatible HTTP server. It simplifies the integration of MCP tools with LLM agents and applications that expect OpenAPI servers.
Use cases
Use cases for mcpo-heroku include making AI tools accessible via standard APIs, integrating legacy MCP tools into modern applications, and providing secure and scalable access to machine learning models.
How to use
To use mcpo-heroku, run it from the command line with uv or Python. For example, run 'uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command', or install it via pip and run 'mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command'.
Key features
Key features include instant compatibility with OpenAPI tools, enhanced security and stability, auto-generated interactive documentation, and the use of pure HTTP without the need for sockets or additional code.
Where to use
mcpo-heroku can be used in various fields where MCP tools need to be integrated with web applications, AI services, and any environment that requires RESTful API access to MCP functionalities.
Content
⚡️ mcpo
Note: This is a clone of the original repository at https://github.com/open-webui/mcpo.git with added support for Heroku deployment and dynamic env variables
Expose any MCP tool as an OpenAPI-compatible HTTP server—instantly.
mcpo is a dead-simple proxy that takes an MCP server command and makes it accessible via standard RESTful OpenAPI, so your tools “just work” with LLM agents and apps expecting OpenAPI servers.
No custom protocol. No glue code. No hassle.
🤔 Why Use mcpo Instead of Native MCP?
MCP servers usually speak over raw stdio, which is:
- 🔓 Inherently insecure
- ❌ Incompatible with most tools
- 🧩 Missing standard features like docs, auth, error handling, etc.
mcpo solves all of that—without extra effort:
- ✅ Works instantly with OpenAPI tools, SDKs, and UIs
- 🛡 Adds security, stability, and scalability using trusted web standards
- 🧠 Auto-generates interactive docs for every tool, no config needed
- 🔌 Uses pure HTTP—no sockets, no glue code, no surprises
What feels like “one more step” is really fewer steps with better outcomes.
mcpo makes your AI tools usable, secure, and interoperable—right now, with zero hassle.
🚀 Quick Usage
We recommend using uv for lightning-fast startup and zero config.
uvx mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
Or, if you’re using Python:
pip install mcpo
mcpo --port 8000 --api-key "top-secret" -- your_mcp_server_command
To use an SSE-compatible MCP server, simply specify the server type and endpoint:
mcpo --port 8000 --api-key "top-secret" --server-type "sse" -- http://127.0.0.1:8001/sse
Docker
You can run mcpo via Docker with the included Dockerfile:
# Build the Docker image
docker build -t my-mcpo .
# Run with a configuration file
docker run -p 8000:8000 \
-v "$(pwd)/mcp.json:/app/mcp.json" \
-v "$(pwd)/custom:/app/custom" \
my-mcpo --api-key "top-secret" --config "/app/mcp.json"
For convenience, a startup script is included:
# Make the script executable
chmod +x start.sh
# Run with default settings
./start.sh
# Or with custom parameters
./start.sh --api-key "your-api-key" --config "your-config.json" --port 9000
# Force a rebuild of the Docker image
./start.sh --build
By default, the script will:
- Build the image only if it doesn’t exist yet
- Use the existing image if available
- Rebuild only when the --build flag is specified
Example:
uvx mcpo --port 8000 --api-key "top-secret" -- uvx mcp-server-time --local-timezone=America/New_York
That’s it. Your MCP tool is now available at http://localhost:8000 with a generated OpenAPI schema — test it live at http://localhost:8000/docs.
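Once the time server example above is running, each MCP tool becomes a plain POST endpoint. A hedged sketch of a call (the get_current_time tool name and its timezone parameter come from mcp-server-time; verify the exact names in your server's generated /docs page):

```shell
# Call the time tool exposed through mcpo. The Bearer token must
# match the --api-key passed at startup; the endpoint path is the
# MCP tool name, and the JSON body carries its parameters.
curl -X POST http://localhost:8000/get_current_time \
  -H "Authorization: Bearer top-secret" \
  -H "Content-Type: application/json" \
  -d '{"timezone": "America/New_York"}'
```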
🤝 To integrate with Open WebUI after launching the server, check our docs.
🔄 Using a Config File
You can serve multiple MCP tools via a single config file that follows the Claude Desktop format:
Start via:
mcpo --config /path/to/config.json
Example config.json:
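A minimal config.json in that format might look like the following (the memory and time servers are illustrative; any MCP server command works, and the package names should be checked against the servers you actually use):

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "time": {
      "command": "uvx",
      "args": ["mcp-server-time", "--local-timezone=America/New_York"]
    }
  }
}
```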
Each tool is served under its own unique route (e.g. http://localhost:8000/memory and http://localhost:8000/time), each with a dedicated OpenAPI schema and proxy handler. Access the full schema UI at: http://localhost:8000/<tool>/docs (e.g. /memory/docs, /time/docs)
🔧 Requirements
- Python 3.8+
- uv (optional, but highly recommended for performance + packaging)
🛠️ Development & Testing
To contribute or run tests locally:
- Set up the environment:
# Clone the repository
git clone https://github.com/flemx/mcpo-heroku
cd mcpo-heroku
# Install dependencies (including dev dependencies)
uv sync --dev
- Run tests:
uv run pytest
🚀 Heroku Deployment
Enhanced Feature: This fork adds full Heroku deployment support!
You can deploy mcpo directly to Heroku using the included heroku.yml file:
# Login to Heroku
heroku login
# Create a new Heroku app (or use an existing one)
heroku create your-app-name
# Set the stack to container
heroku stack:set container -a your-app-name
# Set required environment variables
heroku config:set API_KEY=your-secret-api-key
# Push to Heroku
git push heroku main
The deployment automatically:
- Builds the Docker image using the Dockerfile
- Uses your API_KEY environment variable for authentication
- Uses Heroku’s dynamic PORT assignment
- Runs with the mcp.json config file
To check your logs after deployment:
heroku logs --tail
Your mcpo API will be available at https://your-app-name.herokuapp.com/
🌐 Environment Variables
mcpo now supports loading environment variables from a .env file. This is useful for keeping sensitive information like API keys out of your codebase.
- Create a .env file in your project root:
# API Keys
OPENAI_API_KEY=your_openai_api_key_here
# Salesforce Credentials
SALESFORCE_USERNAME=your_salesforce_username
SALESFORCE_PASSWORD=your_salesforce_password
# MCPO Settings
API_KEY=top-secret
- Update your mcp.json to use environment variables:
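A sketch of what that might look like, assuming the fork's dynamic env support substitutes ${VAR}-style placeholders from the environment (the placeholder syntax and the salesforce server entry are assumptions for illustration, not confirmed from this fork's docs):

```json
{
  "mcpServers": {
    "salesforce": {
      "command": "npx",
      "args": ["-y", "your-salesforce-mcp-server"],
      "env": {
        "SALESFORCE_USERNAME": "${SALESFORCE_USERNAME}",
        "SALESFORCE_PASSWORD": "${SALESFORCE_PASSWORD}"
      }
    }
  }
}
```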
- For Heroku, set these same variables using:
heroku config:set OPENAI_API_KEY=your_key
# etc.
🪪 License
MIT