MCP Ollama
What is MCP Ollama
mcp-ollama is a Model Context Protocol (MCP) server designed to integrate Ollama with Claude Desktop or other MCP clients, allowing users to query models seamlessly.
Use cases
Use cases for mcp-ollama include developing AI applications, conducting research on model performance, and integrating model querying capabilities into existing software solutions.
How to use
To use mcp-ollama, ensure you have Python 3.10 or higher and Ollama installed. Configure Claude Desktop by adding the mcp-ollama server entry to its configuration file. You can then interact with the server through the list_models, show_model, and ask_model tools.
Key features
Key features of mcp-ollama include the ability to list all downloaded Ollama models, retrieve detailed information about specific models, and pose questions to the models.
Where to use
mcp-ollama can be used in various fields such as AI research, software development, and any application requiring model querying and interaction.
MCP Ollama
A Model Context Protocol (MCP) server for integrating Ollama with Claude Desktop or other MCP clients.
Requirements
- Python 3.10 or higher
- Ollama installed and running (https://ollama.com/download)
- At least one model pulled with Ollama (e.g., ollama pull llama2)
Configure Claude Desktop
Add to your Claude Desktop configuration (~/Library/Application Support/Claude/claude_desktop_config.json on macOS, %APPDATA%\Claude\claude_desktop_config.json on Windows):
{
  "mcpServers": {
    "ollama": {
      "command": "uvx",
      "args": [
        "mcp-ollama"
      ]
    }
  }
}
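If you already have other MCP servers configured, the "ollama" entry should sit alongside them under "mcpServers" rather than replace them. As a minimal sketch (the helper function name is hypothetical, not part of mcp-ollama), the entry can be merged programmatically like this:

```python
import json

def add_ollama_server(config: dict) -> dict:
    """Hypothetical helper: merge the mcp-ollama entry into an existing
    Claude Desktop configuration without clobbering other servers."""
    servers = config.setdefault("mcpServers", {})
    servers["ollama"] = {"command": "uvx", "args": ["mcp-ollama"]}
    return config

# Example: a config that already registers another (hypothetical) server.
existing = {"mcpServers": {"filesystem": {"command": "npx"}}}
merged = add_ollama_server(existing)
print(json.dumps(merged, indent=2))
```

Writing the merged dictionary back to claude_desktop_config.json at the path given above, then restarting Claude Desktop, makes the server available.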
Development
Install in development mode:
git clone https://github.com/yourusername/mcp-ollama.git
cd mcp-ollama
uv sync
Test with MCP Inspector:
mcp dev src/mcp_ollama/server.py
Features
The server provides the following tools:
- list_models - List all downloaded Ollama models
- show_model - Get detailed information about a specific model
- ask_model - Ask a question to a specified model
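These tools correspond to endpoints of Ollama's local REST API (GET /api/tags, POST /api/show, POST /api/generate, served on port 11434 by default). The sketch below calls that API directly rather than through mcp-ollama, to illustrate roughly what each tool does; it assumes a running Ollama instance and may differ from the server's actual implementation:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

def _get(path):
    with urllib.request.urlopen(f"{OLLAMA_URL}{path}") as resp:
        return json.loads(resp.read())

def _post(path, payload):
    req = urllib.request.Request(
        f"{OLLAMA_URL}{path}",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def list_models():
    """List all downloaded models (GET /api/tags)."""
    return [m["name"] for m in _get("/api/tags")["models"]]

def show_model(name):
    """Detailed information about one model (POST /api/show).
    Recent Ollama versions use the "model" key; older ones used "name"."""
    return _post("/api/show", {"model": name})

def ask_model(name, question):
    """Ask a question and return the reply (POST /api/generate)."""
    reply = _post(
        "/api/generate",
        {"model": name, "prompt": question, "stream": False},
    )
    return reply["response"]
```

For example, ask_model("llama2", "Why is the sky blue?") would block until the model finishes generating and return the answer as a string.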
License
MIT