nano-MCP
What is nano-MCP?
nano-MCP is an implementation of the Model Context Protocol (MCP) designed for tool-augmented Language Models (LLMs), enabling models like GPT-4.1 and Llama to autonomously utilize external tools for task completion.
Use cases
Use cases for nano-MCP include automating file management tasks, executing version control commands through LLMs, and enhancing user interactions with language models by enabling them to access and manipulate external resources.
How to use
To use nano-MCP, set up the Docker environment by creating a `shared_data` folder and running `docker-compose up -d` in the `src/servers/` directory. Then install the required Python packages and start the MCP Client Hub by configuring environment variables and executing `python nano_mcp_client.py`. You can interact with the system via the CLI or the web interface.
Key features
Key features of nano-MCP include LLM integration for seamless API interaction, a structured MCP client that routes tool calls, and Dockerized FastAPI applications for file management and version control functionalities.
Where to use
nano-MCP can be used in various fields such as software development for version control, data management for file operations, and any domain requiring enhanced LLM capabilities through external tool integration.
Clients Supporting MCP
The following are the main clients that support the Model Context Protocol. Click a link to visit the official website for more information.
Content
nano-MCP: Model Context Protocol Implementation
An implementation of the Model Context Protocol (MCP) for tool-augmented Language Models, enabling LLMs like GPT-4.1 and Llama models to use external tools to complete tasks autonomously.
Architecture
This project follows the Model Context Protocol (MCP) architectural pattern:
- LLM Integration: Logic for interacting with LLM APIs (OpenAI, Claude, etc.). Sends prompts, parses tool call outputs, and routes tool calls to MCP client.
- MCP Client: Routes tool calls to the appropriate MCP servers.
- MCP Servers: Dockerized FastAPI applications exposing tool APIs:
- File Management Server (read, write, tree operations)
- Version Control Server (git operations, command execution)
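The routing layer described above can be sketched in plain Python. The tool names `read_file` and `write_file` below are hypothetical stand-ins for the file server's read/write operations, and the real servers expose their tools through FastAPI routes rather than a bare dictionary; this only illustrates the dispatch idea.

```python
import os
import tempfile

# Hypothetical tool implementations; the real File Management Server
# exposes read, write, and tree operations over HTTP.
def read_file(path: str) -> str:
    with open(path, "r", encoding="utf-8") as f:
        return f.read()

def write_file(path: str, content: str) -> str:
    with open(path, "w", encoding="utf-8") as f:
        f.write(content)
    return f"wrote {len(content)} bytes"

# The MCP client routes a tool call by name to the matching callable.
TOOLS = {"read_file": read_file, "write_file": write_file}

def execute_call(name: str, arguments: dict):
    if name not in TOOLS:
        raise ValueError(f"unknown tool: {name}")
    return TOOLS[name](**arguments)

if __name__ == "__main__":
    path = os.path.join(tempfile.mkdtemp(), "demo.txt")
    print(execute_call("write_file", {"path": path, "content": "hello"}))
    print(execute_call("read_file", {"path": path}))
```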
Components
- src/llm/llm_mcp_client.py: The core client for LLM integration with MCP tools
- src/llm/host_cli.py: Command-line interface for interacting with the LLM+MCP system
- src/llm/host.py: FastAPI server providing a web interface at /
Usage
Docker Setup to run MCP Servers
Create a shared_data folder in the servers folder.
The project uses Docker Compose to orchestrate the MCP servers:
cd src/servers/
docker-compose up -d
This will start:
- File Management Server on port 8000
- Version Control Server on port 8001
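For orientation, a docker-compose.yml consistent with these two ports might look like the sketch below. The service names, build paths, and volume layout are assumptions; consult the actual file in src/servers/.

```yaml
# Hypothetical sketch; the project's real compose file may differ.
services:
  file-management:
    build: ./file_management
    ports:
      - "8000:8000"
    volumes:
      - ./shared_data:/app/shared_data
  version-control:
    build: ./version_control
    ports:
      - "8001:8001"
    volumes:
      - ./shared_data:/app/shared_data
```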
MCP Client Hub
Install Requirements
# activate your environment
cd src/
pip install -r requirements.txt
Start the MCP Client Hub
cd src/client
export MCP_SERVER_URLS="http://localhost:8002,http://localhost:8003"
python nano_mcp_client.py
MCP CLI or UI
Environment Variables
Configure the following environment variables:
OPENAI_API_KEY=your_api_key
OPENAI_BASE_URL=https://api.openai.com/v1
MODEL_NAME=gpt-4-turbo
HOST_MODEL=openai
MCP_CLIENT_URL=http://localhost:8001
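In Python, the host might pick these variables up along the lines below. The defaults mirror the values shown above, but the project's actual configuration handling may differ.

```python
import os

# Sketch of reading the documented environment variables, with the
# README's values used as fallback defaults (an assumption).
config = {
    "api_key": os.environ.get("OPENAI_API_KEY"),
    "base_url": os.environ.get("OPENAI_BASE_URL", "https://api.openai.com/v1"),
    "model": os.environ.get("MODEL_NAME", "gpt-4-turbo"),
    "mcp_url": os.environ.get("MCP_CLIENT_URL", "http://localhost:8001"),
}

if __name__ == "__main__":
    print(config["model"])
```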
CLI Interface
Run the CLI interface:
cd src/llm
python host_cli.py
You can also provide configuration via command-line arguments:
python host_cli.py --api-key YOUR_API_KEY --model gpt-4-turbo --mcp-url http://localhost:8001
Web Interface
Start the web server:
cd src/llm
python host.py
The web interface will be available at http://localhost:7899/
Protocol Details
The Model Context Protocol (MCP) allows LLMs to:
- Discover available tools from MCP servers
- Call tools with proper arguments
- Receive and process tool results
- Continue execution in an autonomous loop
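The steps above can be sketched as a loop with a stubbed LLM standing in for a real API. The message format and the `tool_call` shape below are assumptions for illustration, not the project's actual wire format.

```python
def run_loop(llm, tools, prompt, max_steps=5):
    """Feed tool results back to the LLM until it stops requesting tools."""
    history = [{"role": "user", "content": prompt}]
    for _ in range(max_steps):
        reply = llm(history)            # the model may request a tool call
        if "tool_call" not in reply:
            return reply["content"]     # final answer, loop ends
        call = reply["tool_call"]
        result = tools[call["name"]](**call["arguments"])
        history.append({"role": "tool", "content": str(result)})
    return None  # hypothetical safety cap on autonomous steps

# Stub LLM: requests one tool call, then answers using the result.
def stub_llm(history):
    if history[-1]["role"] == "tool":
        return {"content": f"The answer is {history[-1]['content']}"}
    return {"tool_call": {"name": "add", "arguments": {"a": 2, "b": 3}}}

if __name__ == "__main__":
    print(run_loop(stub_llm, {"add": lambda a, b: a + b}, "What is 2+3?"))
```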
Each MCP server exposes two main endpoints:
- GET /list/tools: Lists available tools with schemas
- POST /execute/call: Executes tools based on name and input
Development
To add new tools, create a new server in the servers/ directory implementing the MCP protocol endpoints.
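As a rough sketch of what such a server must provide, the stdlib-only example below implements the two endpoints with a hypothetical `echo` tool. The project's real servers use FastAPI, and the exact request and response payloads here are assumptions rather than the project's defined schema.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical tool listing; real schemas come from the actual servers.
TOOLS = [{"name": "echo", "description": "Return the input text",
          "input_schema": {"type": "object",
                           "properties": {"text": {"type": "string"}}}}]

class MCPHandler(BaseHTTPRequestHandler):
    def _send(self, payload):
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_GET(self):
        # Tool discovery endpoint.
        if self.path == "/list/tools":
            self._send({"tools": TOOLS})
        else:
            self.send_error(404)

    def do_POST(self):
        # Tool execution endpoint: dispatch on the "name" field.
        if self.path == "/execute/call":
            length = int(self.headers["Content-Length"])
            call = json.loads(self.rfile.read(length))
            if call.get("name") == "echo":
                self._send({"result": call["input"]["text"]})
            else:
                self.send_error(400)
        else:
            self.send_error(404)

    def log_message(self, *args):  # silence request logging
        pass

def serve(port=0):
    """Start the sketch server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), MCPHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = serve()
    port = srv.server_address[1]
    with urllib.request.urlopen(f"http://127.0.0.1:{port}/list/tools") as r:
        print(json.loads(r.read())["tools"][0]["name"])
    srv.shutdown()
```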
Dev Tools Supporting MCP
The following are the main code editors that support the Model Context Protocol. Click a link to visit the official website for more information.