Nchan MCP Transport
What is Nchan MCP Transport
Nchan MCP Transport is a high-performance WebSocket/SSE transport layer and gateway designed for Anthropic’s MCP (Model Context Protocol). It enables real-time communication between MCP clients and various tools and services, leveraging Nginx, Nchan, and FastAPI.
Use cases
Use cases include developing Claude plugins for enhanced functionalities, integrating external APIs for data retrieval or processing, and creating LLM agents that require real-time communication with various services.
How to use
To use Nchan MCP Transport, set up the server with Docker Compose, configure your WebSocket or SSE endpoints, and integrate it with your MCP clients like Claude. Utilize Python decorators to register tools and resources, and take advantage of the built-in OpenAPI integration for seamless API management.
Key features
Key features include dual protocol support for WebSocket and SSE, high-performance pub/sub capabilities, full compliance with the Model Context Protocol, OpenAPI integration for auto-generating tools, a tool/resource registration system using Python decorators, asynchronous execution with background task queues, and easy deployment via Docker.
Where to use
Nchan MCP Transport is ideal for AI developers working on real-time, scalable integrations with AI models like Claude, building plugins, LLM agents, or connecting external APIs to Claude through the MCP.
Content
🚀 Nchan MCP Transport
A high-performance WebSocket/SSE transport layer & gateway for Anthropic’s MCP (Model Context Protocol) — powered by Nginx, Nchan, and FastAPI.
For building real-time, scalable AI integrations with Claude and other LLM agents.
✨ What is this?
Nchan MCP Transport provides a real-time API gateway for MCP clients (like Claude) to talk to your tools and services over:
- 🧵 WebSocket or Server-Sent Events (SSE)
- ⚡️ Streamable HTTP compatible
- 🧠 Powered by Nginx + Nchan for low-latency pub/sub
- 🛠 Integrates with FastAPI for backend logic and OpenAPI tooling
✅ Ideal for AI developers building Claude plugins, LLM agents, or integrating external APIs into Claude via MCP.
🧩 Key Features
| Feature | Description |
|---|---|
| 🔄 Dual Protocol Support | Seamlessly supports WebSocket and SSE with automatic detection |
| 🚀 High Performance Pub/Sub | Built on Nginx + Nchan, handles thousands of concurrent connections |
| 🔌 MCP-Compliant Transport | Fully implements Model Context Protocol (JSON-RPC 2.0) |
| 🧰 OpenAPI Integration | Auto-generate MCP tools from any OpenAPI spec |
| 🪝 Tool / Resource System | Use Python decorators to register tools and resources |
| 📡 Asynchronous Execution | Background task queue + live progress updates via push notifications |
| 🧱 Dockerized Deployment | Easily spin up with Docker Compose |
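The MCP-compliant transport row above refers to JSON-RPC 2.0 framing: every tool invocation travels over the WebSocket/SSE channel as a JSON-RPC message. A rough sketch of what such a `tools/call` request looks like on the wire (the tool name and arguments here are illustrative):

```python
import json

def make_tools_call(req_id: int, tool_name: str, arguments: dict) -> dict:
    """Build an MCP tools/call request using JSON-RPC 2.0 framing."""
    return {
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

request = make_tools_call(1, "search_docs", {"query": "nchan"})
wire = json.dumps(request)   # the payload published to the Nchan channel
decoded = json.loads(wire)
```

The gateway's job is to shuttle exactly these frames between the client's persistent connection and the FastAPI backend.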
🧠 Why Use This?
MCP lets AI assistants like Claude talk to external tools. But:
- Native MCP is HTTP+SSE, which struggles with long tasks, network instability, and high concurrency
- WebSockets aren’t natively supported by Claude — this project bridges the gap
- Server-side logic in pure Python (like FastMCP) may not scale under load
✅ Nchan MCP Transport gives you:
- Web-scale performance (Nginx/Nchan)
- FastAPI-powered backend for tools
- Real-time event delivery to Claude clients
- Plug-and-play OpenAPI to Claude integration
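The Nginx/Nchan layer is what delivers those real-time events. As a rough sketch of the kind of location blocks such a gateway relies on (the paths and channel naming here are illustrative, not the project's actual nginx.conf):

```nginx
# Illustrative only — see the repository's nginx.conf for the real setup.
location ~ ^/mcp/sub/(\w+)$ {
    nchan_subscriber;        # client side: WebSocket or EventSource attach here
    nchan_channel_id $1;
}
location ~ ^/mcp/pub/(\w+)$ {
    nchan_publisher;         # backend pushes JSON-RPC responses into the channel
    nchan_channel_id $1;
}
```

Nchan holds the persistent connections and fans messages out, so the Python backend only ever makes short HTTP publishes.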
🚀 Quickstart
📦 1. Install server SDK
```shell
pip install httmcp
```
🧪 2. Run demo in Docker
```shell
git clone https://github.com/yourusername/nchan-mcp-transport.git
cd nchan-mcp-transport
docker-compose up -d
```
🛠 3. Define your tool
```python
@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."
```
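Since the decorated handler stays an ordinary coroutine, the registration pattern can be pictured with a minimal stand-in registry (`ToolRegistry` below is illustrative, not the actual HTTMCP API):

```python
import asyncio

class ToolRegistry:
    """Minimal stand-in showing how decorator-based tool registration works."""
    def __init__(self):
        self.tools = {}

    def tool(self):
        def decorator(fn):
            self.tools[fn.__name__] = fn  # register under the function name
            return fn                     # leave the coroutine callable as-is
        return decorator

    async def call(self, name, **kwargs):
        return await self.tools[name](**kwargs)

server = ToolRegistry()

@server.tool()
async def search_docs(query: str) -> str:
    return f"Searching for {query}..."

result = asyncio.run(server.call("search_docs", query="nchan"))
```

In the real server, the registry additionally advertises each tool over MCP and dispatches incoming `tools/call` requests to the matching coroutine.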
🧬 4. Expose OpenAPI service (optional)
```python
openapi_server = await OpenAPIMCP.from_openapi("https://example.com/openapi.json", publish_server="http://nchan:80")
app.include_router(openapi_server.router)
```
🖥️ 5. One-Click GPTs Actions to MCP Deployment
HTTMCP provides a powerful CLI for instant deployment of GPTs Actions to MCP servers:
```shell
# Installation
pip install httmcp[cli]

# One-click deployment from GPTs Actions OpenAPI spec
python -m httmcp -f gpt_actions_openapi.json -p http://nchan:80
```
📚 Use Cases
- Claude plugin server over WebSocket/SSE
- Real-time LLM agent backend (LangChain/AutoGen style)
- Connect Claude to internal APIs (via OpenAPI)
- High-performance tool/service bridge for MCP
🔒 Requirements
- Nginx with Nchan module (pre-installed in Docker image)
- Python 3.9+
- Docker / Docker Compose
🛠 Tech Stack
- 🧩 Nginx + Nchan – persistent connection management & pub/sub
- ⚙️ FastAPI – backend logic & JSON-RPC routing
- 🐍 HTTMCP SDK – full MCP protocol implementation
- 🐳 Docker – deployment ready
📎 Keywords
mcp transport, nchan websocket, sse for anthropic, mcp jsonrpc gateway, claude plugin backend, streamable http, real-time ai api gateway, fastapi websocket mcp, mcp pubsub, mcp openapi bridge
🤝 Contributing
Pull requests are welcome! File issues if you’d like to help improve:
- Performance
- Deployment
- SDK integrations
📄 License
MIT License