Deepseek Thinker
What is Deepseek Thinker
The Deepseek Thinker MCP Server is a Model Context Protocol (MCP) provider that enables MCP-enabled AI clients, such as Claude Desktop, to access and utilize Deepseek’s reasoning capabilities through an API service or a local Ollama server.
Use cases
The server can be used for various applications requiring focused reasoning and structured responses in AI interactions. It supports both cloud-based OpenAI API requests and local reasoning via Ollama, making it versatile for personal and professional AI use.
How to use
To integrate with compatible AI clients, modify the client's configuration file (such as claude_desktop_config.json) to specify the MCP server parameters: the command, its arguments, and environment variables. You can also run the server locally or in different modes depending on the desired operating environment.
Key features
Key features include dual mode support for OpenAI API and Ollama, focused reasoning that captures Deepseek’s thought processes, and structured output that enhances the clarity and utility of the responses.
Where to use
The Deepseek Thinker MCP Server is useful in environments where enhanced AI reasoning is needed, such as research, content generation, development, and personal projects that leverage advanced AI capabilities and require structured reasoning outputs.
Deepseek Thinker MCP Server
An MCP (Model Context Protocol) server that provides Deepseek reasoning content to MCP-enabled AI clients, like Claude Desktop. It supports access to Deepseek's thought processes from the Deepseek API service or from a local Ollama server.
Core Features
- 🤖 Dual Mode Support
  - OpenAI API mode support
  - Ollama local mode support
- 🎯 Focused Reasoning
  - Captures Deepseek's thinking process
  - Provides reasoning output
Available Tools
get-deepseek-thinker
- Description: Perform reasoning using the Deepseek model
- Input Parameters:
  - originPrompt (string): User's original prompt
- Returns: Structured text response containing the reasoning process
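To make the tool's contract concrete, here is a sketch of the JSON-RPC request an MCP client would send to invoke get-deepseek-thinker. The request shape follows the MCP tools/call method; the prompt text is an arbitrary example:

```typescript
// Sketch of an MCP tools/call request for get-deepseek-thinker.
// Only originPrompt is passed, per the tool's input parameters above.
const request = {
  jsonrpc: "2.0" as const,
  id: 1,
  method: "tools/call",
  params: {
    name: "get-deepseek-thinker",
    arguments: {
      // The user's original prompt (the tool's only input parameter).
      originPrompt: "Why is the sky blue?",
    },
  },
};
```

The server's reply to this call is a structured text response containing the model's reasoning process.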
Environment Configuration
OpenAI API Mode
Set the following environment variables:
API_KEY=<Your OpenAI API Key>
BASE_URL=<API Base URL>
Ollama Mode
Set the following environment variable:
USE_OLLAMA=true
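Putting the two modes together, the documented precedence can be sketched as follows. This is an illustration of the configuration rules above, not the server's actual source; the selectMode helper is hypothetical:

```typescript
type Mode = "openai" | "ollama";

// Hypothetical helper mirroring the documented configuration:
// USE_OLLAMA=true selects the local Ollama mode; otherwise the
// OpenAI API mode is used, which requires API_KEY and BASE_URL.
function selectMode(env: Record<string, string | undefined>): Mode {
  if (env.USE_OLLAMA === "true") return "ollama";
  if (!env.API_KEY || !env.BASE_URL) {
    throw new Error("OpenAI API mode requires API_KEY and BASE_URL");
  }
  return "openai";
}
```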
Usage
Integration with an AI client, such as Claude Desktop
Add the following configuration to your claude_desktop_config.json:
{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"API_KEY": "<Your API Key>",
"BASE_URL": "<Your Base URL>"
}
}
}
}
Using Ollama Mode
{
"mcpServers": {
"deepseek-thinker": {
"command": "npx",
"args": [
"-y",
"deepseek-thinker-mcp"
],
"env": {
"USE_OLLAMA": "true"
}
}
}
}
Local Server Configuration
{
"mcpServers": {
"deepseek-thinker": {
"command": "node",
"args": [
"/your-path/deepseek-thinker-mcp/build/index.js"
],
"env": {
"API_KEY": "<Your API Key>",
"BASE_URL": "<Your Base URL>"
}
}
}
}
Development Setup
# Install dependencies
npm install
# Build project
npm run build
# Run service
node build/index.js
FAQ
You receive a response like: "MCP error -32001: Request timed out"
This error occurs when the Deepseek API responds too slowly or the reasoning content is too long, causing the MCP request to time out.
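To illustrate how such a timeout arises: MCP clients typically bound each tool call with a deadline, so a slow reasoning response is rejected before it completes. A generic sketch of that client-side behavior (withTimeout is a hypothetical helper, not part of this project):

```typescript
// Races a pending tool call against a deadline. If `work` takes longer
// than `ms`, the promise rejects with a timeout error, which is the kind
// of failure behind "MCP error -32001: Request timed out".
function withTimeout<T>(work: Promise<T>, ms: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error("MCP error -32001: Request timed out")),
      ms,
    );
    work.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); },
    );
  });
}
```

Long reasoning chains simply exceed the deadline; shortening the prompt or using a faster local model via Ollama reduces the chance of hitting it.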
Tech Stack
- TypeScript
- @modelcontextprotocol/sdk
- OpenAI API
- Ollama
- Zod (parameter validation)
License
This project is licensed under the MIT License. See the LICENSE file for details.