Fastapi With Fatmcp Agent
What is Fastapi With Fatmcp Agent
fastapi-with-fatmcp-agent is a modular application built on FastMCP, integrating an MCP server, FastAPI interface, and LLM Agent capabilities. It serves as a demonstration of how these components can work together in a cohesive manner.
Use cases
Use cases include developing web applications that require backend processing with LLMs, creating APIs for AI-driven services, and building modular applications that can easily integrate different functionalities.
How to use
To use fastapi-with-fatmcp-agent, run the MCP server independently with `python main.py --mode mcp` (listens on port 8001), or start the FastAPI server with `python main.py --mode api` (listens on port 8080). The FastAPI server connects to the MCP server via SSE to handle requests.
Key features
Key features include a modular architecture that separates functionalities into distinct packages, integration of FastAPI for HTTP API interactions, and support for LLM processing through dedicated modules.
Where to use
fastapi-with-fatmcp-agent can be used in various fields such as web development, AI applications, and any scenario requiring modular server-client architecture with real-time capabilities.
FastMCP Integration Application Demo
This project demonstrates a modular application built with FastMCP, integrating an MCP server, a FastAPI interface, and LLM Agent capabilities.
Project Architecture
The project utilizes a modular design, separating different functionalities into distinct packages:
```
app/
├── __init__.py
├── api/                      # FastAPI Application Layer
│   ├── __init__.py
│   ├── main.py               # FastAPI main application entry point
│   └── routers/              # API route definitions
│       ├── __init__.py
│       ├── agent.py          # Agent mode routes
│       ├── mcp_resources.py  # MCP resource routes
│       └── mcp_tools.py      # MCP tool routes
├── llm/                      # LLM (Large Language Model) Processing Layer
│   ├── __init__.py
│   ├── base.py               # Base LLM class
│   └── openai.py             # OpenAI implementation
└── mcp_server/               # MCP Server Definition Layer
    ├── __init__.py
    ├── base.py               # Base MCP Server class
    ├── run.py                # Script for internal Client connection or direct execution
    └── simple.py             # Simple MCP Server implementation (with tools & resources)
main.py                       # Main entry point (runs API or MCP server)
```
Core Workflow (API Mode):
- Run MCP Server Independently: Start a separate MCP server process using `python main.py --mode mcp`, listening on a specified port (default: 8001) with SSE transport.
- Run API Server: Start the FastAPI server using `python main.py --mode api` (listens on 8080).
- Connection: The `mcp_client` within the FastAPI server connects to the independently running MCP server via SSE.
- Request Handling: Frontend or other clients interact with the application through the HTTP API provided by FastAPI.
- Tool/Resource/Agent Calls: FastAPI routes forward requests to the `mcp_client` (communicating with the MCP server) or the `llm` module (communicating with the LLM API).
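The routing split above can be sketched as a tiny dispatcher. This is illustrative only: the route prefixes come from the API Endpoints section below, but the function name and return labels are hypothetical, not the project's actual code.

```python
def dispatch(path: str) -> str:
    """Decide which backend component handles an incoming API path.

    Tool and resource routes are forwarded over the SSE connection to
    the MCP server; agent routes go to the LLM module, which may itself
    decide to call MCP tools.
    """
    if path.startswith("/api/tools") or path.startswith("/api/resources"):
        return "mcp_client"  # forwarded to the MCP server (default port 8001)
    if path.startswith("/api/agent"):
        return "llm"         # LLM autonomously selects and calls tools
    return "api"             # plain FastAPI route, e.g. /health

print(dispatch("/api/tools/add"))      # mcp_client
print(dispatch("/api/agent/process"))  # llm
```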
Features
- Modular Design: Clear separation of concerns (API, LLM, MCP) for easy extension and maintenance.
- Dual Run Modes: `api` mode runs the FastAPI server and requires a separately running MCP server; `mcp` mode runs the MCP server directly for testing or for connection by other clients.
- LLM Integration: Supports using OpenAI (or other extensible LLMs) to process tool outputs or execute in Agent mode.
- Agent Mode: Provides an `/api/agent/process` endpoint for the LLM to autonomously select and call MCP tools.
- Complete API: Offers RESTful endpoints for MCP tools, resources, and Agent functionality via FastAPI.
- Persistent Connection: The API server maintains a long-lived SSE connection to the MCP server for efficiency.
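The dual run modes suggest a single entry point that parses a `--mode` flag. A minimal sketch of such a parser is below; the flag names and defaults are taken from the Usage section, but the real `main.py` may structure this differently.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Flags as documented in the Usage section; defaults match the README."""
    p = argparse.ArgumentParser(description="Run the API or MCP server")
    p.add_argument("--mode", choices=["api", "mcp"], required=True,
                   help="api: FastAPI server; mcp: standalone MCP server")
    p.add_argument("--port", type=int, default=8080, help="API server port")
    p.add_argument("--reload", action="store_true",
                   help="enable hot-reloading (development only)")
    p.add_argument("--mcp-host", default="127.0.0.1")
    p.add_argument("--mcp-port", type=int, default=8001)
    p.add_argument("--mcp-transport", choices=["sse", "stdio"], default="sse")
    return p

args = build_parser().parse_args(["--mode", "mcp", "--mcp-port", "8002"])
print(args.mode, args.mcp_port)  # mcp 8002
```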
Installation
1. Clone the repository (if needed).
2. Install dependencies (using uv or pip):

```shell
# Recommended: use uv
uv pip install -e .

# Alternatively, use pip
# pip install -e .
```

3. Set Environment Variables:
   - The `OPENAI_API_KEY` environment variable is required to use LLM features.
   - You can create a `.env` file in the project root and define it there: `OPENAI_API_KEY=sk-...`
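How the `.env` file feeds the environment is not specified by the README; packages like python-dotenv are the usual choice. As a rough illustration of what such loading does, here is a minimal stand-in that handles only plain `KEY=value` lines (real loaders also support quoting and multiline values):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: one KEY=value per line, '#' comments ignored.

    Variables already present in the environment are not overwritten.
    """
    try:
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line or line.startswith("#") or "=" not in line:
                    continue
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
    except FileNotFoundError:
        pass  # no .env file is fine; the variable may be set elsewhere

load_env()
# LLM modules can then read the key as usual:
api_key = os.environ.get("OPENAI_API_KEY")
```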
Usage
Running the API Server (Recommended)
This mode requires first starting the standalone MCP server.
1. Start the MCP Server (in one terminal):

```shell
# Uses SSE transport, listening on 127.0.0.1:8001 (default)
python main.py --mode mcp

# Or specify a different host and port
# python main.py --mode mcp --mcp-host 0.0.0.0 --mcp-port 8002
```

2. Start the API Server (in another terminal):

```shell
# Listens on 0.0.0.0:8080 (default)
python main.py --mode api

# Use a different port or enable hot-reloading (for development)
# python main.py --mode api --port 9000 --reload
```

The API server is now accessible at `http://localhost:8080` (or your specified port). It will automatically connect to the MCP server started in step 1 (default connection: `http://localhost:8001/sse`).
Running the MCP Server Directly
If you only need to run the MCP server (e.g., for direct connection by other FastMCP clients):
```shell
# Default: Use SSE transport, listening on 127.0.0.1:8001
python main.py --mode mcp

# Use Standard I/O (stdio) transport
# python main.py --mode mcp --mcp-transport stdio
```
API Endpoints (Accessed via API Server)
The API server runs on http://localhost:8080 (default).
- MCP Tools (`/api/tools`):
  - `GET /`: List all available tools and their parameters.
  - `POST /{tool_name}`: Call a specific tool. Example request body: `{"params": {"a": 5, "b": 3}, "use_llm": true, "system_message": "Explain the result"}`
- MCP Resources (`/api/resources`):
  - `GET /`: List all available resource URIs and their types.
  - `GET /{resource_path:path}`: Get the content of a specific resource (e.g., `GET /api/resources/example/greeting`).
- Agent Mode (`/api/agent`):
  - `POST /process`: Let the LLM autonomously handle a user request, potentially calling tools. Example request body: `{"prompt": "What is 5 plus 3?"}`
- Health Checks:
  - `GET /health`: Check if the API server is running.
  - `GET /api/tools/health`: Check the connection status between the API server and the MCP server.
Extending the Application
Adding New Tools
- Extend `BaseMCPServer` in `app/mcp_server/simple.py` (or create a new server file).
- Define new tools using the `@self.mcp.tool()` decorator within the `_register_tools` method.
- If you created a new server file, import and instantiate it in `app/mcp_server/run.py`.
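The decorator-based registration in `_register_tools` follows the general pattern below. Note the sketch substitutes a hypothetical `ToolRegistry` stand-in for the real FastMCP server object, so it runs without FastMCP installed; in the actual project, `self.mcp` is the FastMCP instance.

```python
class ToolRegistry:
    """Hypothetical stand-in for the FastMCP server's tool registry."""
    def __init__(self):
        self.tools = {}

    def tool(self):
        """Mimics the @self.mcp.tool() decorator: registers a function by name."""
        def decorator(fn):
            self.tools[fn.__name__] = fn
            return fn
        return decorator

class SimpleMCPServer:
    """Sketch of the server-class shape suggested by the steps above."""
    def __init__(self):
        self.mcp = ToolRegistry()
        self._register_tools()

    def _register_tools(self):
        @self.mcp.tool()
        def add(a: int, b: int) -> int:
            """Add two numbers."""
            return a + b

server = SimpleMCPServer()
print(server.mcp.tools["add"](5, 3))  # 8
```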
Adding New LLM Providers
- Create a new Python file in the `app/llm/` directory (e.g., `anthropic.py`).
- Create a new class inheriting from `app.llm.base.BaseLLM`.
- Implement the `generate` and `generate_with_tools` methods.
- Import and use the new LLM class where needed (e.g., in `agent.py`).
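Assuming `BaseLLM` declares `generate` and `generate_with_tools` as abstract methods (the real signatures live in `app/llm/base.py` and may differ), a new provider might look like this; `EchoLLM` is a made-up provider useful only for offline testing.

```python
from abc import ABC, abstractmethod

class BaseLLM(ABC):
    """Sketch of the base class; the real one is app.llm.base.BaseLLM."""
    @abstractmethod
    def generate(self, prompt: str) -> str: ...

    @abstractmethod
    def generate_with_tools(self, prompt: str, tools: list) -> str: ...

class EchoLLM(BaseLLM):
    """Hypothetical provider that echoes its input, for offline testing."""
    def generate(self, prompt: str) -> str:
        return f"echo: {prompt}"

    def generate_with_tools(self, prompt: str, tools: list) -> str:
        names = ", ".join(t.get("name", "?") for t in tools)
        return f"echo: {prompt} (tools available: {names})"

print(EchoLLM().generate("hello"))  # echo: hello
```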
Development
Enable hot-reloading when running the API server for development:
# Ensure the standalone MCP server is still running
python main.py --mode api --reload