
Ollama MCP

@NightTrekon 9 months ago
Ollama MCP is a bridge that integrates local LLMs (Large Language Models) into MCP applications.

Overview

What is Ollama MCP

Ollama-mcp is a powerful server that acts as a bridge between Ollama and the Model Context Protocol (MCP), allowing seamless integration of Ollama’s local LLM capabilities into MCP-powered applications.

Use cases

Use cases for Ollama-mcp include deploying local AI models for chat applications, managing custom models for specific tasks, and integrating local LLM capabilities into existing MCP-powered applications.

How to use

To use Ollama-mcp, install the necessary dependencies, build the server, and configure it in your MCP settings. You can then pull and run models using the provided API.

Key features

Key features include complete Ollama integration with full API coverage, OpenAI-compatible chat functionality, local LLM execution, model management capabilities, and server control options.

Where to use

Ollama-mcp can be used in various fields such as AI development, natural language processing, chatbots, and any application requiring local model execution and management.

Content

Ollama MCP Server

🚀 A powerful bridge between Ollama and the Model Context Protocol (MCP), enabling seamless integration of Ollama’s local LLM capabilities into your MCP-powered applications.

🌟 Features

Complete Ollama Integration

  • Full API Coverage: Access all essential Ollama functionality through a clean MCP interface
  • OpenAI-Compatible Chat: Drop-in replacement for OpenAI’s chat completion API
  • Local LLM Power: Run AI models locally with full control and privacy

Core Capabilities

  • 🔄 Model Management

    • Pull models from registries
    • Push models to registries
    • List available models
    • Create custom models from Modelfiles
    • Copy and remove models
  • 🤖 Model Execution

    • Run models with customizable prompts
    • Chat completion API with system/user/assistant roles
    • Configurable parameters (temperature, timeout)
    • Raw mode support for direct responses
  • 🛠 Server Control

    • Start and manage Ollama server
    • View detailed model information
    • Error handling and timeout management

🚀 Getting Started

Prerequisites

  • Ollama installed on your system
  • Node.js and npm/pnpm

Installation

  1. Install dependencies:
     pnpm install
  2. Build the server:
     pnpm run build

Configuration

Add the server to your MCP configuration:

For Claude Desktop:

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json
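A minimal sketch of the entry to add, using the standard `mcpServers` format from Claude Desktop's configuration; the server name and the path to the built entry point (`build/index.js` here) are assumptions that depend on where you cloned and built the project:

```json
{
  "mcpServers": {
    "ollama": {
      "command": "node",
      "args": ["/path/to/ollama-mcp/build/index.js"]
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server is picked up.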

🛠 Usage Examples

Pull and Run a Model

// Pull a model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "pull",
  arguments: {
    name: "llama2"
  }
});

// Run the model
await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "run",
  arguments: {
    name: "llama2",
    prompt: "Explain quantum computing in simple terms"
  }
});

Chat Completion (OpenAI-compatible)

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "chat_completion",
  arguments: {
    model: "llama2",
    messages: [
      {
        role: "system",
        content: "You are a helpful assistant."
      },
      {
        role: "user",
        content: "What is the meaning of life?"
      }
    ],
    temperature: 0.7
  }
});

Create Custom Model

await mcp.use_mcp_tool({
  server_name: "ollama",
  tool_name: "create",
  arguments: {
    name: "custom-model",
    modelfile: "./path/to/Modelfile"
  }
});
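For reference, a Modelfile is Ollama's plain-text model definition format. A minimal example (the base model and settings below are illustrative, not part of this project):

```
FROM llama2
PARAMETER temperature 0.7
SYSTEM You are a concise assistant that answers in one short paragraph.
```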

🔧 Advanced Configuration

  • OLLAMA_HOST: Configure custom Ollama API endpoint (default: http://127.0.0.1:11434)
  • Timeout settings for model execution (default: 60 seconds)
  • Temperature control for response randomness (0-2 range)
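The defaults above can be sketched with two hypothetical helpers; these illustrate the documented behavior (endpoint override via OLLAMA_HOST, temperature clamped to 0-2) and are not the server's actual internals:

```typescript
// Resolve the Ollama endpoint: OLLAMA_HOST overrides the documented default.
function resolveOllamaHost(env: Record<string, string | undefined>): string {
  return env.OLLAMA_HOST ?? "http://127.0.0.1:11434";
}

// Keep temperature inside the documented 0-2 range.
function clampTemperature(t: number): number {
  return Math.min(2, Math.max(0, t));
}
```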

🤝 Contributing

Contributions are welcome! Feel free to:

  • Report bugs
  • Suggest new features
  • Submit pull requests

📝 License

MIT License - feel free to use in your own projects!


Built with ❤️ for the MCP ecosystem
