Ollama MCP
What is Ollama MCP
Ollama MCP is a Python-based client designed to connect to the Ollama large language model and the MCP (Model Context Protocol) server, facilitating seamless interaction between the model and external tools.
Use cases
Use cases for Ollama MCP include building chatbots, developing AI-driven applications, conducting data analysis with natural language queries, and integrating language models into existing software solutions.
How to use
To use Ollama MCP, first ensure you have Python 3.12+, Ollama CLI, and Java 21+. Clone the repository, install the required packages, configure the MCP server and Ollama model in the respective JSON and Python files, and run the main script using Python.
Key features
Key features include integration with the Ollama large language model (default is qwen3:14b), support for Server-Sent Events (SSE) for asynchronous communication, an extensible tool invocation system, an interactive command-line interface, and simple format conversion between Spring AI’s tools_list format and Ollama format.
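The SSE support mentioned above relies on the standard Server-Sent Events wire format, in which each event is a block of `field: value` lines terminated by a blank line. As a minimal sketch of that framing (independent of the project's actual client code):

```python
def parse_sse_events(stream: str) -> list[dict]:
    """Split a raw SSE text stream into events, collecting the
    `event:` and `data:` fields (multiple data lines are joined)."""
    events = []
    current = {"event": "message", "data": []}
    for line in stream.splitlines():
        if not line:  # a blank line terminates the current event
            if current["data"]:
                events.append({"event": current["event"],
                               "data": "\n".join(current["data"])})
            current = {"event": "message", "data": []}
        elif line.startswith("event:"):
            current["event"] = line[len("event:"):].strip()
        elif line.startswith("data:"):
            current["data"].append(line[len("data:"):].strip())
    return events

# Example stream: an endpoint event followed by a default message event
raw = "event: endpoint\ndata: /messages?id=1\n\ndata: hello\n\n"
events = parse_sse_events(raw)
```

A real client would read these blocks incrementally off an HTTP response rather than from a string, but the framing rules are the same.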
Where to use
Ollama MCP can be used in various fields such as natural language processing, AI development, and any application requiring interaction between large language models and external tools.
Clients Supporting MCP
The following are the main client applications that support the Model Context Protocol. Click a link to visit the official website for more information.
Content
Ollama MCP Client
A Python-based client implementation for connecting the Ollama large language model to an MCP (Model Context Protocol) server, enabling seamless interaction between the model and external tools.
Features
- Integrates the Ollama large language model (qwen3:14b by default)
- Supports asynchronous communication via SSE (Server-Sent Events)
- Extensible tool invocation system
- Interactive command-line interface
- Simple tool-format conversion (Spring AI's tools_list format ⟷ Ollama format)
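The tools_list ⟷ Ollama conversion can be pictured roughly as follows. This is a hypothetical sketch, not this project's code: the `inputSchema` field name follows MCP's tools/list result shape, and the output follows Ollama's documented tool-calling format.

```python
def to_ollama_tool(tool: dict) -> dict:
    """Convert a tools_list-style tool descriptor into the shape
    Ollama's chat API expects (input field names are assumptions)."""
    return {
        "type": "function",
        "function": {
            "name": tool["name"],
            "description": tool.get("description", ""),
            "parameters": tool.get(
                "inputSchema", {"type": "object", "properties": {}}
            ),
        },
    }

# Example: a hypothetical tool descriptor from a tools_list response
weather_tool = {
    "name": "get_weather",
    "description": "Look up the weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}
ollama_tool = to_ollama_tool(weather_tool)
```

The reverse direction is mostly a matter of unwrapping the `function` object again, which is why the README can describe the conversion as "simple".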
Quick Start
Requirements
- Python 3.12+
- Ollama CLI (see the official Ollama website for installation instructions)
- Java 21+
Installation
git clone https://github.com/Shlysz/ollama-mcp.git
cd ollama-mcp
pip install -r requirements.txt
Configuration
- Configure the MCP server in client/mcp_server_config.json. The author's MCP server is published in the repository's releases; the format is as follows:
{
  "mcpServers": {
    "server-name": {
      "url": "http://localhost:8080/sse"
    }
  }
}
If you have your own MCP server, you can configure its address here instead.
- Configure the Ollama model in client/Constants.py:
LLAMA_MODEL_QWEN = "qwen3:14b" # 或其他Ollama支持的模型
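To give a sense of how a config file in the format above might be consumed, here is a minimal, hypothetical loader (not the project's actual code) that extracts the SSE endpoint URLs:

```python
import json
import os
import tempfile

def load_server_urls(path: str) -> dict:
    """Return a mapping of server name -> SSE endpoint URL
    from an mcp_server_config.json-style file."""
    with open(path, encoding="utf-8") as f:
        config = json.load(f)
    return {name: entry["url"]
            for name, entry in config.get("mcpServers", {}).items()}

# Demo: write the example config to a temporary file and load it back
example = {"mcpServers": {"server-name": {"url": "http://localhost:8080/sse"}}}
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(example, f)
    path = f.name
urls = load_server_urls(path)
os.unlink(path)
```

Each URL would then be handed to an SSE client session for the corresponding server.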
Run
python client/main.py
Project Structure
Ollama-MCP-Client
├── client
│   ├── Constants.py
│   ├── McpClient.py
│   ├── OllamaAgent.py
│   ├── OllamaTools.py
│   ├── __init__.py
│   ├── main.py
│   ├── mcp_server_config.json
│   └── utils
│       └── JsonUtil.py
├── requirements.txt
└── test
    ├── __init__.py
    └── test_ollamaToolsformat.py
Acknowledgements
This project was inspired by mihirrd/ollama-mcp-client, which supports only local transport and not SSE; this project is the reverse. Many thanks.
License
MIT License
Dev Tools Supporting MCP
The following are the main code editors that support the Model Context Protocol. Click a link to visit the official website for more information.