
Simple Mcp Ollama Bridge

@virajsharma2000 · 10 months ago · MIT
A simple bridge connecting Ollama to a fetch-URL MCP server.

Overview

What is simple-mcp-ollama-bridge?

simple-mcp-ollama-bridge is a bridge that connects Model Context Protocol (MCP) servers to OpenAI-compatible large language models (LLMs) such as those served by Ollama, letting the model call tools that MCP servers expose.
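At its core, a bridge like this translates tool definitions between the two protocols. The sketch below is illustrative only (the function and field names are assumptions based on the two public schemas, not this project's actual code); it shows how an MCP-style tool description might be mapped to the OpenAI function-calling tool shape:

```python
def mcp_tool_to_openai(tool: dict) -> dict:
    """Map an MCP-style tool description to an OpenAI function-calling tool.

    Illustrative sketch: field names follow the MCP and OpenAI public
    schemas, but this is not the bridge's actual implementation.
    """
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            # MCP's JSON Schema for inputs maps directly onto
            # OpenAI's "parameters" field.
            "parameters": tool.get("inputSchema", {"type": "object", "properties": {}}),
        },
    }

# An MCP fetch-style tool description (hypothetical example values)
fetch_tool = {
    "name": "fetch",
    "description": "Fetch a URL and return its contents",
    "inputSchema": {
        "type": "object",
        "properties": {"url": {"type": "string"}},
        "required": ["url"],
    },
}

openai_tool = mcp_tool_to_openai(fetch_tool)
```

Because both sides describe tool inputs with JSON Schema, the mapping is mostly a matter of renaming fields.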

Use cases

Use cases include developing chatbots, enhancing language understanding in applications, and creating AI-driven content generation tools that leverage the capabilities of both MCP servers and LLMs.

How to use

To use simple-mcp-ollama-bridge, clone the repository, install the dependencies using the provided installation script, and configure the bridge parameters in the main.py file. Be sure to set the correct paths and API keys.

Key features

Key features include compatibility with OpenAI API specifications, support for various LLMs, and a straightforward setup process that allows users to easily connect MCP servers with local models.

Where to use

simple-mcp-ollama-bridge can be used in fields such as artificial intelligence, natural language processing, and any application that requires integration between MCP servers and LLMs.

Content

MCP LLM Bridge

A bridge connecting Model Context Protocol (MCP) servers to OpenAI-compatible LLMs like Ollama.
Read more about MCP by Anthropic.

Quick Start

```shell
# Install uv, clone the repo, and set up the environment
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/bartolli/mcp-llm-bridge.git
cd mcp-llm-bridge
uv venv
source .venv/bin/activate
uv pip install -e .
```


Note: reactivate the environment if needed to use the keys in `.env`: `source .venv/bin/activate`
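The commented-out OpenAI configuration in main.py reads its settings from environment variables with hard-coded fallbacks. A minimal sketch of that pattern (the variable values here are simulated for illustration):

```python
import os

# The bridge's commented-out OpenAI config uses this pattern:
# read a value from the environment, falling back to a default.
os.environ.pop("OPENAI_MODEL", None)          # simulate an unset variable
model = os.getenv("OPENAI_MODEL", "gpt-4o")   # falls back to "gpt-4o"

os.environ["OPENAI_MODEL"] = "llama3.2"       # simulate a value loaded from .env
model = os.getenv("OPENAI_MODEL", "gpt-4o")   # now the .env value wins
```

This is why the environment must be active: if the shell that runs the bridge has not sourced the venv (and loaded `.env`), the fallbacks silently apply.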

Then configure the bridge in [src/mcp_llm_bridge/main.py](https://raw.githubusercontent.com/virajsharma2000/simple-mcp-ollama-bridge/master/src/mcp_llm_bridge/main.py)

```python
mcp_server_params=StdioServerParameters(
    command="uv",
    # CHANGE THIS: it must be an absolute path! Put the MCP fetch server at
    # this directory (clone from https://github.com/modelcontextprotocol/servers/)
    args=["--directory", "~/llms/mcp/mc-server-fetch/servers/src/fetch", "run", "mcp-server-fetch"],
    env=None
),
# llm_config=LLMConfig(
#     api_key=os.getenv("OPENAI_API_KEY"),
#     model=os.getenv("OPENAI_MODEL", "gpt-4o"),
#     base_url=None
# ),
llm_config=LLMConfig(
    api_key="ollama",  # can be any string for local testing
    model="llama3.2",
    base_url="http://localhost:11434/v1"  # point to your local model's endpoint
),
```

Additional Endpoint Support

The bridge also works with any endpoint implementing the OpenAI API specification:

Ollama

```python
llm_config=LLMConfig(
    api_key="not-needed",
    model="mistral-nemo:12b-instruct-2407-q8_0",
    base_url="http://localhost:11434/v1"
)
```
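Any server that accepts the OpenAI chat-completions request shape will work. The request body such an endpoint expects looks roughly like this (illustrative sketch: field names follow the OpenAI API specification, and the model name matches the Ollama config above):

```python
import json

# A minimal OpenAI-style chat-completions request body, as any
# OpenAI-compatible endpoint (including Ollama's /v1 API) expects it.
payload = {
    "model": "mistral-nemo:12b-instruct-2407-q8_0",
    "messages": [
        {"role": "user", "content": "Fetch https://example.com and summarize it."}
    ],
}

body = json.dumps(payload)
```

Roughly speaking, the bridge POSTs a body like this to `base_url` + `/chat/completions`, sending the configured `api_key` as a bearer token (which local Ollama ignores).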

License

MIT

Contributing

PRs welcome.
