mcp-server-dify
What is mcp-server-dify?
mcp-server-dify is a Model Context Protocol (MCP) server for Dify AI. It enables large language models (LLMs) to interact with Dify AI’s chat completion capabilities through a standardized protocol.
Use cases
Use cases for mcp-server-dify include enhancing chatbots with restaurant recommendations, maintaining conversation context in dialogues, and facilitating real-time interactions with Dify AI’s chat capabilities.
How to use
To use mcp-server-dify, install it via npm with `npm install @modelcontextprotocol/server-dify`. Then configure your `claude_desktop_config.json` with your Dify API endpoint and key to enable the server.
Key features
Key features include integration with Dify AI chat completion API, a restaurant recommendation tool (meshi-doko), support for conversation context, streaming response support, and implementation in TypeScript.
Content
mcp-server-dify
Model Context Protocol Server for Dify AI. This server enables LLMs to interact with Dify AI’s chat completion capabilities through a standardized protocol.
Features
- Integration with Dify AI chat completion API
- Restaurant recommendation tool (meshi-doko)
- Support for conversation context
- Streaming response support
- TypeScript implementation
Installation
Using Docker
# Build the Docker image
make docker
# Run with Docker
docker run -i --rm mcp/dify https://your-dify-api-endpoint your-dify-api-key
Usage
With Claude Desktop
Add the following configuration to your `claude_desktop_config.json`:
{
  "mcpServers": {
    "dify": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-dify",
        "https://your-dify-api-endpoint",
        "your-dify-api-key"
      ]
    }
  }
}
Replace `your-dify-api-endpoint` and `your-dify-api-key` with your actual Dify API credentials.
Tools
meshi-doko
Restaurant recommendation tool that interfaces with Dify AI:
Parameters:
- LOCATION (string): Location of the restaurant
- BUDGET (string): Budget constraints
- query (string): Query to send to Dify AI
- conversation_id (string, optional): For maintaining chat context
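The README does not show a raw request, but assuming the server follows the standard MCP `tools/call` shape, an invocation of meshi-doko could look like this sketch (all argument values are illustrative, and `conversation_id` can be omitted to start a new conversation):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "meshi-doko",
    "arguments": {
      "LOCATION": "Shibuya, Tokyo",
      "BUDGET": "3000 JPY",
      "query": "Recommend a casual ramen shop for dinner",
      "conversation_id": "abc123"
    }
  }
}
```

In practice an MCP client such as Claude Desktop constructs this request for you; the sketch is only meant to show how the four parameters map onto the tool's `arguments` object.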
Development
# Initial setup
make setup
# Build the project
make build
# Format code
make format
# Run linter
make lint
License
This project is released under the MIT License.
Security
This server interacts with Dify AI using your provided API key. Be sure to:
- Keep your API credentials secure
- Use HTTPS for the API endpoint
- Never commit API keys to version control
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.