Ollama-MCP-Bridge-WebUI
What is Ollama-MCP-Bridge-WebUI
Ollama-MCP-Bridge-WebUI is a web interface that connects local Ollama LLMs to Model Context Protocol (MCP) servers, allowing open-source models to perform file operations, web searches, and reasoning tasks similar to commercial AI assistants, all on private hardware.
Use cases
Use cases include creating custom AI assistants for specific tasks, enhancing local applications with AI capabilities, performing data analysis, and enabling advanced reasoning tasks in a secure environment.
How to use
To use Ollama-MCP-Bridge-WebUI, you can either run the automatic installation script or set it up manually. The automatic script installs necessary dependencies, sets up the workspace, and configures the environment. For manual setup, install Ollama, pull the Qwen model, install dependencies, create a workspace, configure API keys, and build the project.
Key features
Key features include multi-MCP integration for connecting multiple servers, automatic tool detection based on user queries, a clean web interface with collapsible tool descriptions, and a comprehensive toolset that includes filesystem access, web search, and reasoning capabilities.
Where to use
Ollama-MCP-Bridge-WebUI can be used in various fields such as AI research, software development, and any domain requiring local AI assistance with privacy and control over data.
Content
Ollama-MCP Bridge WebUI
A TypeScript implementation that connects local LLMs (via Ollama) to Model Context Protocol (MCP) servers with a web interface. This bridge allows open-source models to use the same tools and capabilities as Claude, enabling powerful local AI assistants that run entirely on your own hardware.
Features
- Multi-MCP Integration: Connect multiple MCP servers simultaneously
- Tool Detection: Automatically identifies which tool to use based on queries
- Web Interface: Clean UI with collapsible tool descriptions
- Comprehensive Toolset: Filesystem, web search, and reasoning capabilities
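The tool-detection feature above can be sketched as a simple keyword match over the user's query. The function and keyword lists below are an illustrative assumption, not the bridge's actual implementation:

```typescript
// Hypothetical sketch of keyword-based tool detection (not the bridge's real code).
// Maps a user query to one of the three configured MCP tool groups.
type ToolName = "filesystem" | "brave-search" | "sequential-thinking" | null;

const KEYWORDS: Record<Exclude<ToolName, null>, string[]> = {
  "filesystem": ["file", "directory", "folder", "read", "write", "save"],
  "brave-search": ["search", "web", "latest", "news", "look up"],
  "sequential-thinking": ["step by step", "reason", "plan", "think through"],
};

function detectTool(query: string): ToolName {
  const q = query.toLowerCase();
  for (const [tool, words] of Object.entries(KEYWORDS)) {
    if (words.some((w) => q.includes(w))) return tool as ToolName;
  }
  return null; // fall back to the plain LLM when no tool matches
}
```

For example, `detectTool("search the web for MCP servers")` resolves to `"brave-search"`, while a query with no matching keywords falls through to the model alone.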
Setup
Automatic Installation
The easiest way to set up the bridge is to use the included installation script:
./install.bat
This script will:
- Check for and install Node.js if needed
- Check for and install Ollama if needed
- Install all dependencies
- Create the workspace directory (…/workspace)
- Set up initial configuration
- Build the TypeScript project
- Download the Qwen model for Ollama
After running the script, you only need to:
- Add your API keys to the .env file (the $VARIABLE_NAME references in the config will be replaced with actual values)
Manual Setup
If you prefer to set up manually:
- Install Ollama from ollama.com/download
- Pull the Qwen model: ollama pull qwen2.5-coder:7b-instruct-q4_K_M
- Install dependencies: npm install
- Create a workspace directory: mkdir ../workspace
- Configure API keys in .env
- Build the project: npm run build
Configuration
The bridge is configured through two main files:
1. bridge_config.json
This file defines MCP servers, LLM settings, and system prompt. Environment variables are referenced with $VARIABLE_NAME syntax.
Example:
{
  "mcpServers": {
    "filesystem": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-filesystem/dist/index.js",
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/../workspace"
      ],
      "allowedDirectory": "To/Your/Directory/Ollama-MCP-Bridge-WebUI/../workspace"
    },
    "brave-search": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-brave-search/dist/index.js"
      ],
      "env": {
        "BRAVE_API_KEY": "$BRAVE_API_KEY"
      }
    },
    "sequential-thinking": {
      "command": "node",
      "args": [
        "To/Your/Directory/Ollama-MCP-Bridge-WebUI/node_modules/@modelcontextprotocol/server-sequential-thinking/dist/index.js"
      ]
    }
  },
  "llm": {
    "model": "qwen2.5-coder:7b-instruct-q4_K_M",
    "baseUrl": "http://localhost:11434",
    "apiKey": "ollama",
    "temperature": 0.7,
    "maxTokens": 8000
  },
  "systemPrompt": "You are a helpful assistant that can use various tools to help answer questions. You have access to three main tool groups: 1) Filesystem operations - for working with files and directories, 2) Brave search - for finding information on the web, 3) Sequential thinking for complex problem-solving. When a user asks a question that requires external information, real-time data, or file manipulation, you should use a tool rather than guessing or using only your pre-trained knowledge."
}
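The llm block above maps directly onto an Ollama /api/chat request. As a rough sketch (the buildChatRequest helper is an illustrative assumption; the field names follow Ollama's chat API, where options.num_predict caps generated tokens):

```typescript
// Build the request body a bridge like this would send to Ollama's
// /api/chat endpoint, using values from the "llm" block of bridge_config.json.
interface LlmConfig {
  model: string;
  baseUrl: string;
  temperature: number;
  maxTokens: number;
}

function buildChatRequest(cfg: LlmConfig, systemPrompt: string, userMessage: string) {
  return {
    model: cfg.model,
    messages: [
      { role: "system", content: systemPrompt },
      { role: "user", content: userMessage },
    ],
    stream: false,
    // Ollama takes sampling settings in an "options" object;
    // num_predict limits the number of tokens generated.
    options: { temperature: cfg.temperature, num_predict: cfg.maxTokens },
  };
}

const cfg: LlmConfig = {
  model: "qwen2.5-coder:7b-instruct-q4_K_M",
  baseUrl: "http://localhost:11434",
  temperature: 0.7,
  maxTokens: 8000,
};
const body = buildChatRequest(cfg, "You are a helpful assistant.", "List the files in the workspace.");
```

The resulting body would be POSTed to cfg.baseUrl + "/api/chat".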
2. .env file
This file stores sensitive information like API keys:
# Brave Search API key
BRAVE_API_KEY=your_brave_key_here
The bridge will automatically replace $BRAVE_API_KEY in the configuration with the actual value from your .env file.
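The $VARIABLE_NAME substitution can be sketched as a single regex pass over the loaded config text (an illustrative sketch, not the bridge's actual code):

```typescript
// Replace $VARIABLE_NAME placeholders in a config string with values
// from a lookup table (e.g. process.env after loading the .env file).
function substituteEnv(text: string, env: Record<string, string | undefined>): string {
  return text.replace(/\$([A-Z_][A-Z0-9_]*)/g, (match, name) =>
    env[name] !== undefined ? env[name]! : match // leave unknown variables untouched
  );
}

const raw = '{"env": {"BRAVE_API_KEY": "$BRAVE_API_KEY"}}';
const resolved = substituteEnv(raw, { BRAVE_API_KEY: "abc123" });
// resolved === '{"env": {"BRAVE_API_KEY": "abc123"}}'
```

Leaving unknown placeholders untouched makes a missing key easy to spot in the expanded config.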
Usage
Starting the Bridge
Simply run:
./start.bat
This will start the bridge with the web interface.
Web Interface
Open http://localhost:8080 (or the port shown in the console) in your browser to access the web interface.
License
MIT License - See LICENSE file for details