MCP Explorer

Deepseek R1

@66julienmartinon 12 days ago
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)

Overview

What is Deepseek R1

Deepseek R1 MCP Server is a Model Context Protocol server for the DeepSeek R1 language model, which is optimized for reasoning tasks and supports a context window of up to 8192 tokens. It is implemented in Node.js and TypeScript for stable, type-safe integration with MCP clients such as Claude Desktop.

Use cases

The Deepseek R1 MCP Server is suitable for various tasks, including advanced text generation, coding, data cleaning, translation, general conversation, and creative writing. It can be tailored for specific applications by adjusting the temperature parameter to control the randomness of the outputs.

How to use

To use the server, you need to clone the repository, install dependencies, set up an environment file with your Deepseek API key, and configure Claude Desktop to point to the server. You can adjust model selections and configuration parameters as needed before running the server in development or production mode.

Key features

Key features include support for both DeepSeek-R1 and DeepSeek-V3, advanced text generation, configurable parameters such as max_tokens and temperature, robust error handling with detailed messages, and full MCP protocol support.

Where to use

The server can be integrated into applications that require natural language processing capabilities, such as chatbots, data analysis tools, and creative applications. It is particularly useful in environments that support Claude Desktop and require efficient handling of large context sizes.

Content

Deepseek R1 MCP Server

A Model Context Protocol (MCP) server implementation for the Deepseek R1 language model. Deepseek R1 is a powerful language model optimized for reasoning tasks with a context window of 8192 tokens.

Why Node.js?
This implementation uses Node.js/TypeScript as it provides the most stable integration with MCP servers. The Node.js SDK offers better type safety, error handling, and compatibility with Claude Desktop.


Quick Start

Installing manually

# Clone and install
git clone https://github.com/66julienmartin/MCP-server-Deepseek_R1.git
cd deepseek-r1-mcp
npm install

# Set up environment
cp .env.example .env  # Then add your API key

# Build
npm run build

Prerequisites

  • Node.js (v18 or higher)
  • npm
  • Claude Desktop
  • Deepseek API key

Model Selection

By default, this server uses the DeepSeek-R1 model. To use DeepSeek-V3 instead, modify the model name in src/index.ts:

// For DeepSeek-R1 (default)
model: "deepseek-reasoner"

// For DeepSeek-V3
model: "deepseek-chat"
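As a sketch of where this setting ends up, the snippet below shows how a chosen model name might be folded into a DeepSeek chat-completions request body. The MODEL constant and buildRequestBody helper are illustrative names, not code taken from src/index.ts.

```typescript
// Illustrative sketch only: MODEL and buildRequestBody are hypothetical
// names, not the repository's actual code.
const MODEL = "deepseek-reasoner"; // switch to "deepseek-chat" for DeepSeek-V3

interface ChatRequestBody {
  model: string;
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  max_tokens: number;
  temperature: number;
}

function buildRequestBody(prompt: string): ChatRequestBody {
  return {
    model: MODEL,
    messages: [{ role: "user", content: prompt }],
    max_tokens: 8192, // the model's full context budget
    temperature: 0.2, // the server's documented default
  };
}
```

Switching models is then a one-line change to MODEL, with the rest of the request untouched.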

Project Structure

deepseek-r1-mcp/
├── src/
│   ├── index.ts             # Main server implementation
├── build/                   # Compiled files
│   ├── index.js
├── LICENSE
├── README.md
├── package.json
├── package-lock.json
└── tsconfig.json

Configuration

  1. Create a .env file:
DEEPSEEK_API_KEY=your-api-key-here
  2. Update Claude Desktop configuration:
{
  "mcpServers": {
    "deepseek_r1": {
      "command": "node",
      "args": [
        "/path/to/deepseek-r1-mcp/build/index.js"
      ],
      "env": {
        "DEEPSEEK_API_KEY": "your-api-key"
      }
    }
  }
}

Development

npm run dev     # Watch mode
npm run build   # Build for production

Features

  • Advanced text generation with Deepseek R1 (8192 token context window)
  • Configurable parameters (max_tokens, temperature)
  • Robust error handling with detailed error messages
  • Full MCP protocol support
  • Claude Desktop integration
  • Support for both DeepSeek-R1 and DeepSeek-V3 models

API Usage

{
  "name": "deepseek_r1",
  "arguments": {
    "prompt": "Your prompt here",
    "max_tokens": 8192,    // Maximum tokens to generate
    "temperature": 0.2     // Controls randomness
  }
}
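Server-side, a tool call like this needs its arguments checked before being forwarded to the DeepSeek API. The sketch below shows one way to validate them against the defaults documented above; validateArgs and its interface are hypothetical names, not the server's actual code.

```typescript
// Hypothetical argument validation for the deepseek_r1 tool; names are
// illustrative, not taken from the repository.
interface DeepseekArgs {
  prompt: string;
  max_tokens: number;  // defaults to 8192
  temperature: number; // defaults to 0.2
}

function validateArgs(raw: Record<string, unknown>): DeepseekArgs {
  if (typeof raw.prompt !== "string" || raw.prompt.length === 0) {
    throw new Error("prompt must be a non-empty string");
  }
  const max_tokens = typeof raw.max_tokens === "number" ? raw.max_tokens : 8192;
  if (max_tokens < 1 || max_tokens > 8192) {
    throw new Error("max_tokens must be between 1 and 8192");
  }
  const temperature = typeof raw.temperature === "number" ? raw.temperature : 0.2;
  if (temperature < 0 || temperature > 2) {
    throw new Error("temperature must be between 0 and 2");
  }
  return { prompt: raw.prompt, max_tokens, temperature };
}
```

Validating early lets the server return a clear MCP error instead of a failed API call.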

The Temperature Parameter

The default value of temperature is 0.2.

Deepseek recommends setting the temperature according to your specific use case:

USE CASE                        TEMPERATURE   EXAMPLE
Coding / Math                   0.0           Code generation, mathematical calculations
Data Cleaning / Data Analysis   1.0           Data processing tasks
General Conversation            1.3           Chat and dialogue
Translation                     1.3           Language translation
Creative Writing / Poetry       1.5           Story writing, poetry generation
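These recommendations are easy to encode as a small lookup table when building on top of the server. The names below are illustrative, not part of the server's code.

```typescript
// Hypothetical lookup table for DeepSeek's recommended temperatures;
// not part of the server's actual implementation.
type UseCase = "coding" | "data" | "conversation" | "translation" | "creative";

const RECOMMENDED_TEMPERATURE: Record<UseCase, number> = {
  coding: 0.0,       // code generation, mathematical calculations
  data: 1.0,         // data cleaning / analysis
  conversation: 1.3, // chat and dialogue
  translation: 1.3,  // language translation
  creative: 1.5,     // story writing, poetry
};

function temperatureFor(useCase: UseCase): number {
  return RECOMMENDED_TEMPERATURE[useCase];
}
```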

Error Handling

The server provides detailed error messages for common issues:

  • API authentication errors
  • Invalid parameters
  • Rate limiting
  • Network issues
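As an illustration of the kind of mapping involved, the sketch below translates common API HTTP status codes into user-facing messages. The function name and the exact status codes are assumptions, not the server's actual code.

```typescript
// Illustrative error mapping; describeApiError is a hypothetical name and
// the status codes shown are assumptions, not the server's actual behavior.
function describeApiError(status: number): string {
  switch (status) {
    case 401:
      return "Authentication failed: check DEEPSEEK_API_KEY";
    case 422:
      return "Invalid parameters: check prompt, max_tokens, and temperature";
    case 429:
      return "Rate limit exceeded: retry after a short delay";
    default:
      return status >= 500
        ? "DeepSeek service error: try again later"
        : `Unexpected API error (HTTP ${status})`;
  }
}
```

Surfacing a specific message per failure mode makes misconfiguration (a bad key, an out-of-range parameter) immediately visible in Claude Desktop.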

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

License

MIT

Tools

deepseek_r1
Generate text using the DeepSeek R1 model
