# MCP Deepseek Agent

MCP server implementation using Ollama's DeepSeek model for seamless AI integration.
## Features
- 🤖 MCP protocol compliance
- 🔄 Ollama integration with Deepseek model
- ⚙️ Automatic configuration
- 🧹 Clean responses (removes thinking tags)
- 📝 Standard MCP protocol endpoints
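The "clean responses" feature strips the chain-of-thought markup that reasoning models such as deepseek-r1 emit before their final answer. A minimal sketch of how that cleanup might look, assuming the model wraps its reasoning in `<think>...</think>` tags (the exact tag format and function name here are assumptions, not the package's actual API):

```python
import re

def strip_thinking(text: str) -> str:
    """Remove <think>...</think> blocks so only the final answer remains."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()

raw = "<think>Let me work through this...</think>The answer is 42."
print(strip_thinking(raw))  # -> The answer is 42.
```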
## Quick Start

1. Install Ollama, then pull and run the DeepSeek model:

   ```shell
   ollama run deepseek-r1
   ```

2. Install the package:

   ```shell
   pip install git+https://github.com/freebeiro/mcp-deepseek-agent.git
   ```

3. Start the server:

   ```shell
   mcp-deepseek-agent
   ```
## Configuration

Configuration is set through environment variables:

```shell
export OLLAMA_API_URL="http://localhost:11434"
export OLLAMA_MODEL="deepseek-r1"
export TEMPERATURE="0.7"
export TOP_P="0.9"
export MCP_PORT="8080"
```
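A sketch of how a server like this might read those variables, falling back to the defaults shown above when they are unset (the dictionary layout and default values here are assumptions matching the README, not the package's internal structure):

```python
import os

# Read each setting from the environment, falling back to the
# documented defaults; numeric values are parsed explicitly.
config = {
    "api_url": os.environ.get("OLLAMA_API_URL", "http://localhost:11434"),
    "model": os.environ.get("OLLAMA_MODEL", "deepseek-r1"),
    "temperature": float(os.environ.get("TEMPERATURE", "0.7")),
    "top_p": float(os.environ.get("TOP_P", "0.9")),
    "port": int(os.environ.get("MCP_PORT", "8080")),
}
print(config["model"])
```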
## Usage in MCP Configuration

Add to your MCP client configuration:

```json
{
  "mcpServers": {
    "deepseek": {
      "command": "mcp-deepseek-agent",
      "args": [],
      "env": {
        "OLLAMA_MODEL": "deepseek-r1"
      }
    }
  }
}
```
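If you prefer to patch a client's configuration programmatically rather than by hand, the entry above can be built and merged as a plain dictionary. This is a generic sketch; the server label `"deepseek"` and the target config file location are up to your MCP client:

```python
import json

# The server entry from the README, as a Python dict.
entry = {
    "command": "mcp-deepseek-agent",
    "args": [],
    "env": {"OLLAMA_MODEL": "deepseek-r1"},
}

# Merge into an existing (possibly empty) MCP config.
config = {"mcpServers": {}}
config["mcpServers"]["deepseek"] = entry
print(json.dumps(config, indent=2))
```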
## Documentation

See DOCUMENTATION.md for detailed usage and API documentation.

## License

MIT License - see LICENSE file for details.