MCP Server Deepseek R1
What is MCP Server Deepseek R1?
MCP-server-Deepseek_R1 is an implementation of a Model Context Protocol (MCP) server that connects Claude Desktop with DeepSeek’s language models, specifically the R1 and V3 versions. It is optimized for reasoning tasks and supports a context window of 8192 tokens.
Use cases
Use cases for MCP-server-Deepseek_R1 include coding assistance, data analysis, general conversation, translation, and creative writing. The model’s temperature parameter can be adjusted based on the specific task requirements.
How to use
To use MCP-server-Deepseek_R1, clone the repository, install the necessary dependencies using npm, set up your environment with your Deepseek API key, and run the server. You can also configure the model type in the source code if needed.
Key features
Key features include advanced text generation capabilities, configurable parameters such as max_tokens and temperature, robust error handling, full MCP protocol support, and integration with Claude Desktop. It supports both DeepSeek-R1 and DeepSeek-V3 models.
Deepseek R1 MCP Server
A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.
Why Node.js?
This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.
Quick Start
Installing manually
# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd deepseek-r1-mcp
npm install
# Set up environment
cp .env.example .env # Then add your API key
# Build
npm run build
Prerequisites
- Node.js (v18 or higher)
- npm
- Claude Desktop
- Deepseek API key
Model Selection
By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3 instead, change the model name in src/index.ts:
// For DeepSeek-R1 (default)
model: "deepseek-reasoner"
// For DeepSeek-V3
model: "deepseek-chat"
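The model name could also be selected at startup rather than by editing the source. A minimal sketch of that idea (the `pickModel` helper and its `"v3"` switch value are illustrative assumptions, not part of this repository):

```typescript
// Hypothetical helper: map a short configuration value to the DeepSeek
// model identifier. "deepseek-reasoner" is DeepSeek-R1 (the default),
// "deepseek-chat" is DeepSeek-V3.
function pickModel(choice?: string): string {
  return choice === "v3" ? "deepseek-chat" : "deepseek-reasoner";
}

console.log(pickModel());     // "deepseek-reasoner" (DeepSeek-R1, the default)
console.log(pickModel("v3")); // "deepseek-chat" (DeepSeek-V3)
```

In practice the `choice` argument could come from an environment variable, keeping the build untouched when switching models.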
Project Structure
deepseek-r1-mcp/
├── src/
│   ├── index.ts          # Main server implementation
├── build/                # Compiled files
│   ├── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json
Configuration
- Create a .env file:
DEEPSEEK_API_KEY=your-api-key-here
- Update Claude Desktop configuration:
{
"mcpServers": {
"deepseek_r1": {
"command": "node",
"args": [
"/path/to/deepseek-r1-mcp/build/index.js"
],
"env": {
"DEEPSEEK_API_KEY": "your-api-key"
}
}
}
}
Development
npm run dev # Watch mode
npm run build # Build for production
Features
- Advanced text generation with Deepseek R1 (8192 token context window)
- Configurable parameters (max_tokens, temperature)
- Robust error handling with detailed error messages
- Full MCP protocol support
- Claude Desktop integration
- Support for both DeepSeek-R1 and DeepSeek-V3 models
API Usage
{
"name": "deepseek_r1",
"arguments": {
"prompt": "Your prompt here",
"max_tokens": 8192, // Maximum tokens to generate
"temperature": 0.2 // Controls randomness
}
}
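Since the context window is 8192 tokens, a server might validate the arguments above before forwarding them. A sketch of such a check (the `normalizeArgs` helper and its clamping behavior are illustrative assumptions, not documented by this repository):

```typescript
// Illustrative argument shape matching the API usage example above.
interface DeepseekArgs {
  prompt: string;
  max_tokens?: number;
  temperature?: number;
}

// Hypothetical normalizer: clamp max_tokens to the 8192-token window
// and fill in the README's defaults for missing values.
function normalizeArgs(args: DeepseekArgs): Required<DeepseekArgs> {
  return {
    prompt: args.prompt,
    max_tokens: Math.min(args.max_tokens ?? 8192, 8192),
    temperature: args.temperature ?? 0.2,
  };
}

console.log(normalizeArgs({ prompt: "Your prompt here", max_tokens: 99999 }));
```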
The Temperature Parameter
The default temperature is 0.2. DeepSeek recommends setting it according to your specific use case:
| USE CASE | TEMPERATURE | EXAMPLE |
|---|---|---|
| Coding / Math | 0.0 | Code generation, mathematical calculations |
| Data Cleaning / Data Analysis | 1.0 | Data processing tasks |
| General Conversation | 1.3 | Chat and dialogue |
| Translation | 1.3 | Language translation |
| Creative Writing / Poetry | 1.5 | Story writing, poetry generation |
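The table above can be expressed as a small lookup. The task labels and the `temperatureFor` helper below are illustrative, not part of the server's API:

```typescript
// DeepSeek's recommended temperatures, transcribed from the table above.
// The task names are illustrative labels, not server parameters.
const RECOMMENDED_TEMPERATURE: Record<string, number> = {
  coding: 0.0,
  data_analysis: 1.0,
  conversation: 1.3,
  translation: 1.3,
  creative_writing: 1.5,
};

function temperatureFor(task: string): number {
  // Fall back to the server's default of 0.2 for unrecognized tasks.
  return RECOMMENDED_TEMPERATURE[task] ?? 0.2;
}

console.log(temperatureFor("coding"));           // 0
console.log(temperatureFor("creative_writing")); // 1.5
```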
Error Handling
The server provides detailed error messages for common issues:
- API authentication errors
- Invalid parameters
- Rate limiting
- Network issues
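A sketch of how those categories might be turned into user-facing messages. The HTTP status codes are an assumption based on common REST API conventions, not taken from this repository:

```typescript
// Hypothetical mapping from HTTP status codes to the error categories
// listed above. The exact codes are an assumption, not from the repo.
function describeApiError(status: number): string {
  switch (status) {
    case 401:
      return "API authentication failed: check DEEPSEEK_API_KEY";
    case 400:
      return "Invalid parameters: check max_tokens and temperature";
    case 429:
      return "Rate limited: retry after a short delay";
    default:
      return `Network or server issue (HTTP ${status})`;
  }
}

console.log(describeApiError(401));
```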
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT