localMCP
What is localMCP
localMCP is a modern, full-featured Model Context Protocol (MCP) host that supports multiple large language models (LLMs) through a React frontend and FastAPI backend.
Use cases
Use cases for localMCP include developing chatbots, conducting AI experiments, providing interactive learning tools, and integrating LLMs into existing applications for enhanced functionality.
How to use
To use localMCP, clone the repository, run the setup script to install dependencies, and start the application. Optionally, configure LLM providers by creating a .env file with your API keys.
Key features
Key features include multi-LLM support (OpenAI, Anthropic, Ollama), universal MCP server support with various transport options, a modern web interface with rich chat capabilities, and advanced functionalities like intelligent tool use and real-time search.
Where to use
localMCP can be used in various fields such as software development, AI research, and educational applications where interaction with multiple LLMs is beneficial.
Local MCP Host
A modern, full-featured Model Context Protocol (MCP) host with React frontend, FastAPI backend, and comprehensive multi-LLM support.
🚀 Features
Multi-LLM Support
- 🤖 OpenAI: GPT-4, GPT-4 Turbo, GPT-3.5 Turbo
- 🎯 Anthropic: Claude 3.5 Sonnet, Claude 3 Sonnet, Claude 3 Haiku
- 🦙 Ollama: Auto-discovers all local models
- 🔄 Real-time Model Switching: Change models mid-conversation
Universal MCP Server Support
- 🔌 STDIO & SSE Transports: Local and remote server connections
- 🛠️ Popular Servers: Filesystem, SQLite, Git, and more
- ➕ Add Servers via UI: No config file editing required
- ⚡ Auto-reconnection: Robust connection management
Modern Web Interface
- 💬 Rich Chat: Markdown rendering with syntax highlighting
- 📊 Tools Browser: Comprehensive tool discovery and documentation
- 🔧 Interactive Testing: Execute MCP tools with parameter forms
- 📱 Responsive Design: Works on desktop and mobile
Advanced Capabilities
- 🧠 Intelligent Tool Use: LLMs automatically discover and use available tools
- 💾 Persistent State: Chat history and model selection preserved across tabs
- 🔍 Real-time Search: Find tools across all connected servers
- 📋 Parameter Documentation: Auto-generated schema documentation
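The "intelligent tool use" above follows the usual MCP host loop: the model's reply may name a tool, the host executes it against the matching server, and the result is fed back into the conversation. A minimal sketch, with a hypothetical in-process tool registry standing in for connected MCP servers (the real dispatch lives in `src/simple_mcp_host.py`):

```python
import json

# Hypothetical registry standing in for tools discovered from MCP servers.
TOOLS = {
    "list_files": lambda args: ["a.txt", "b.txt"],
}

def run_turn(llm_reply: dict, history: list) -> list:
    """Append the LLM reply; if it requested a tool, execute it and
    append the result so the model can read it on the next turn."""
    history.append(llm_reply)
    call = llm_reply.get("tool_call")
    if call:
        result = TOOLS[call["name"]](call.get("arguments", {}))
        history.append({
            "role": "tool",
            "name": call["name"],
            "content": json.dumps(result),
        })
    return history

history = run_turn(
    {"role": "assistant", "tool_call": {"name": "list_files", "arguments": {}}},
    [{"role": "user", "content": "List files in /tmp"}],
)
```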
📋 Requirements
- Python 3.8+
- Node.js 16+
- npm or yarn
🚀 Quick Start
1. Clone and Setup
```bash
git clone <repository-url>
cd universal-mcp-host
```
2. Start the Application
```bash
./start.sh
```
This script will:
- Install Python and Node.js dependencies
- Start the FastAPI backend on port 8000
- Start the React frontend on port 3000
- Display live logs from both services
3. Configure LLM Providers (Optional)
Create `config/.env` with your API keys:
```bash
# OpenAI (optional)
OPENAI_API_KEY=sk-your-openai-key

# Anthropic Claude (optional)
ANTHROPIC_API_KEY=sk-ant-your-claude-key

# Ollama (runs locally, no key needed)
OLLAMA_BASE_URL=http://localhost:11434
```
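The variable names above suggest how the backend can decide which providers to expose. As a hedged sketch (the detection logic itself is an assumption, not a copy of `backend/main.py`), a provider counts as available when its key is set, while Ollama only needs a base URL:

```python
# Sketch of provider detection from the env vars above.
# Assumption: the backend treats a present API key as "provider enabled";
# the real logic in backend/main.py may differ.
def available_providers(env) -> list:
    providers = []
    if env.get("OPENAI_API_KEY"):
        providers.append("openai")
    if env.get("ANTHROPIC_API_KEY"):
        providers.append("anthropic")
    # Ollama needs no API key; a base URL (or the local default) is enough.
    if env.get("OLLAMA_BASE_URL", "http://localhost:11434"):
        providers.append("ollama")
    return providers

providers = available_providers({"OPENAI_API_KEY": "sk-test"})  # → ['openai', 'ollama']
```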
4. Access the Interface
- Frontend: http://localhost:3000
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
🔧 Adding MCP Servers
Via Web Interface (Recommended)
- Go to the Servers tab
- Click “Add Server”
- Fill in the server details
- Click “Add Server” to connect
Popular Server Examples
Filesystem Access:
- Command: `npx`
- Args: `-y @modelcontextprotocol/server-filesystem /path/to/directory`

SQLite Database:
- Command: `uvx`
- Args: `mcp-server-sqlite --db-path /path/to/database.db`

Git Repository:
- Command: `npx`
- Args: `-y @modelcontextprotocol/server-git /path/to/repo`

Brave Search:
- Command: `uvx`
- Args: `mcp-server-brave-search`
Manual Configuration
Edit `config/servers.json`:
```json
{
  "servers": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/tmp"
      ],
      "description": "File system operations"
    },
    "sqlite": {
      "type": "stdio",
      "command": "uvx",
      "args": [
        "mcp-server-sqlite",
        "--db-path",
        "/tmp/test.db"
      ],
      "description": "SQLite database access"
    },
    "github": {
      "type": "sse",
      "url": "https://your-github-mcp-server.com/sse",
      "headers": {
        "Authorization": "Bearer YOUR_TOKEN"
      },
      "description": "GitHub repository access"
    }
  }
}
```
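A hand-edited `servers.json` is easy to get wrong. The following is a minimal sketch for sanity-checking the file before restarting the host; the field names come from the example above, but the validation rules here are assumptions, not the host's actual checks:

```python
import json

def load_servers(text: str) -> dict:
    """Parse a servers.json string and check the fields each
    transport type plausibly requires (assumed rules)."""
    config = json.loads(text)
    servers = config.get("servers", {})
    for name, spec in servers.items():
        # stdio servers are launched as subprocesses, so they need a command.
        if spec.get("type") == "stdio" and "command" not in spec:
            raise ValueError(f"stdio server '{name}' needs a 'command'")
        # sse servers are remote, so they need a URL.
        if spec.get("type") == "sse" and "url" not in spec:
            raise ValueError(f"sse server '{name}' needs a 'url'")
    return servers

servers = load_servers('{"servers": {"fs": {"type": "stdio", "command": "npx"}}}')
```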
💬 Using the Chat Interface
Basic Usage
- Select your preferred LLM model from the dropdown
- Start chatting! The AI can automatically discover and use MCP tools
- Your conversation and model selection persist across tab switches
Example Prompts
- “List files in the /tmp directory”
- “Read the contents of package.json”
- “Search the codebase for TODO comments”
- “Query the database for user records”
- “Create a new file with some content”
Advanced Features
- Markdown Support: Code blocks, lists, and formatting render properly
- Tool Introspection: Ask “What tools do you have access to?”
- Multi-step Operations: “Analyze all Python files and create a summary”
🛠️ Tools Management
Browse Tools Tab
- Complete Tool Listing: See all tools from all connected servers
- Parameter Documentation: View required/optional parameters with descriptions
- Server Organization: Tools grouped by server with visual indicators
- Quick Testing: Click “Try It” to jump to the execution interface
Execute Tools Tab
- Interactive Forms: Dynamic parameter forms based on tool schemas
- JSON Validation: Real-time validation of tool arguments
- Result Display: Formatted output with syntax highlighting
- Error Handling: Clear error messages with debugging info
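The argument validation described above checks a tool call against the tool's published schema before execution. The sketch below is a stdlib-only stand-in: the real UI validates full JSON Schema, while this only checks required keys and basic types, and the example schema is hypothetical:

```python
# Hypothetical schema for an imaginary "list directory" tool.
SCHEMA = {
    "required": ["path"],
    "properties": {"path": {"type": "string"}, "recursive": {"type": "boolean"}},
}

# Map JSON Schema type names to the Python types they correspond to.
PY_TYPES = {"string": str, "boolean": bool, "number": (int, float)}

def validate_args(args: dict, schema: dict) -> list:
    """Return a list of human-readable errors (empty means valid)."""
    errors = [f"missing required '{k}'"
              for k in schema.get("required", []) if k not in args]
    for key, value in args.items():
        expected = schema.get("properties", {}).get(key, {}).get("type")
        if expected and not isinstance(value, PY_TYPES[expected]):
            errors.append(f"'{key}' should be {expected}")
    return errors

assert validate_args({"path": "/tmp"}, SCHEMA) == []
```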
🏗️ Architecture
```
┌─────────────────┐    ┌──────────────────┐    ┌─────────────────┐
│ React Frontend  │ ── │ FastAPI Backend  │ ── │  LLM Providers  │
│   (Port 3000)   │    │   (Port 8000)    │    │  OpenAI/Claude  │
└─────────────────┘    └──────────────────┘    └─────────────────┘
                                │                       │
                                ▼                       ▼
                       ┌──────────────────┐      ┌─────────────┐
                       │ Simple MCP Host  │      │   Ollama    │
                       │                  │      │ (Local LLMs)│
                       └──────────────────┘      └─────────────┘
                                │
                                ▼
                       ┌──────────────────┐
                       │   MCP Servers    │
                       │ ┌──────────────┐ │
                       │ │  Filesystem  │ │
                       │ │  SQLite      │ │
                       │ │  Git         │ │
                       │ │  Custom...   │ │
                       │ └──────────────┘ │
                       └──────────────────┘
```
📁 Project Structure
```
universal-mcp-host/
├── frontend/                  # React application
│   ├── src/
│   │   ├── App.js             # Main application component
│   │   ├── ChatInterface.js   # Chat UI with markdown support
│   │   ├── MCPTools.js        # Tools browser and executor
│   │   ├── ModelSelector.js   # LLM model selection
│   │   └── AddServerModal.js  # Server configuration modal
│   └── package.json           # Frontend dependencies
│
├── backend/                   # FastAPI application
│   ├── main.py                # API server and endpoints
│   ├── requirements.txt       # Backend dependencies
│   └── venv/                  # Python virtual environment
│
├── src/                       # Core MCP implementation
│   ├── simple_mcp_host.py     # Robust MCP client implementation
│   ├── mcp_host.py            # Original MCP host (fallback)
│   └── llm_clients.py         # Multi-provider LLM clients
│
├── config/                    # Configuration files
│   ├── servers.json           # MCP server definitions
│   ├── .env                   # Environment variables (create this)
│   └── .env.example           # Environment template
│
├── start.sh                   # Startup script
└── README.md                  # This file
```
🔧 Development
Backend Development
```bash
cd backend
source venv/bin/activate
pip install -r requirements.txt
python main.py
```
Frontend Development
```bash
cd frontend
npm install
npm start
```
Adding New LLM Providers
- Extend `BaseLLMClient` in `src/llm_clients.py`
- Add provider detection in `backend/main.py`
- Update the model selector UI in `frontend/src/ModelSelector.js`
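The steps above hinge on subclassing `BaseLLMClient`. Its real interface is defined in `src/llm_clients.py`; the `chat()` signature below is an assumption made for illustration, and `EchoClient` is a hypothetical provider showing the minimal shape a new backend plugs into:

```python
# Assumed base-class shape; the actual interface lives in src/llm_clients.py.
class BaseLLMClient:
    def chat(self, messages: list) -> str:
        raise NotImplementedError

class EchoClient(BaseLLMClient):
    """Hypothetical provider that just echoes the last user message.
    A real provider would call its API here and return the reply text."""
    def chat(self, messages: list) -> str:
        return messages[-1]["content"]

reply = EchoClient().chat([{"role": "user", "content": "hello"}])  # → 'hello'
```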
Adding New MCP Transports
- Extend transport support in `src/simple_mcp_host.py`
- Update the server configuration schema
- Add UI forms in `frontend/src/AddServerModal.js`
🐛 Troubleshooting
Common Issues
MCP Server Won’t Connect:
- Check server command and arguments
- Verify the server executable is available
- Review server logs in the terminal
- Ensure proper permissions for file access
No LLM Models Available:
- Configure API keys in `config/.env`
- For Ollama: ensure it’s running (`ollama serve`)
- Check network connectivity for cloud providers
Tools Not Working:
- Verify MCP server is properly connected
- Check tool arguments match expected schema
- Review server permissions and access rights
Frontend Build Errors:
- Delete `node_modules` and run `npm install`
- Check Node.js version (16+ required)
- Clear browser cache and reload
Debug Mode
Enable detailed logging:
```bash
export MCP_LOG_LEVEL=DEBUG
./start.sh
```
Reset Configuration
```bash
# Backup current config
cp config/servers.json config/servers.json.backup

# Reset to minimal config
echo '{"servers": {}}' > config/servers.json
```
🤝 Contributing
- Fork the repository
- Create a feature branch
- Make your changes with tests
- Submit a pull request
📚 Resources
- MCP Specification
- Official MCP Servers
- Community MCP Registry
- FastAPI Documentation
- React Documentation
📄 License
MIT License - see LICENSE file for details.
🎯 What Makes This Special
- Production Ready: Robust error handling, reconnection logic, and comprehensive logging
- User Friendly: No command-line required - manage everything through the web interface
- Developer Friendly: Clear architecture, extensive documentation, and easy to extend
- Provider-Agnostic: Works with any LLM provider and any MCP server
- Modern Stack: React + FastAPI + TypeScript-style development experience
Local MCP Host - The complete solution for MCP integration! 🚀✨