ZAI MCP Server
Overview
What is ZAI MCP Server
ZAI Multi-Provider MCP Server is a free, open-source multi-provider AI server that supports various API providers, including OpenRouter, Anthropic, and DeepSeek. It features automatic failover, AI-to-AI interaction capabilities, and smart data collection to enhance AI models without requiring any license validation.
Use cases
This server is ideal for developers and researchers looking to leverage AI capabilities in applications such as natural language processing, coding assistance, and data analysis. It can be utilized for AI-to-AI loops to improve model performance, enable multi-model voting for consensus, and enhance various applications requiring advanced thinking and reasoning.
How to use
Users can quickly set up the ZAI MCP Server by specifying environment variables for the desired API keys and models in a JSON configuration format. Options include using recommended free models, integrating paid models from providers like Anthropic, or experimenting with advanced multimodal capabilities from Google Gemini. Running the server can be done easily with a single command via npm.
Key features
Key features include full multi-provider support, automatic failover for enhanced reliability, AI-to-AI loops for continuous model improvement, smart data collection to refine AI training, and comprehensive metadata tracking for usage statistics. The server also supports multiple API keys for high availability and various consensus strategies for AI voting.
Where to use
The ZAI MCP Server can be used in development and production environments for AI applications, in research settings focused on AI enhancement, and in software development workflows that need coding assistance. It is available worldwide, with no restrictions on access.
Content
ZAI Multi-Provider MCP Server
FREE Multi-Provider AI MCP Server with support for OpenRouter, Anthropic, and DeepSeek APIs. Features automatic failover, AI-to-AI loops, and smart data collection. No license validation required - completely free for all users!
Support Development
If you find this project helpful, consider supporting development:
Crypto Donations
- BNB (Binance Smart Chain):
0xB8E0b6D4BaaFd1ac4De9245A760cB8F09bB7D084
- Bitcoin (BTC):
bc1q77k0ju6ta3sp0vm3phm6dek432rzg7cqwf43z6
- Ethereum (ETH):
0xB8E0b6D4BaaFd1ac4De9245A760cB8F09bB7D084
- Polygon (MATIC):
0xB8E0b6D4BaaFd1ac4De9245A760cB8F09bB7D084
- Dogecoin (DOGE):
DGbDQEgJLnNR2yEmWFrusFd7jLya4aoZMA
Your support helps maintain and improve this open-source project!
Key Features
- Completely FREE - No license validation or restrictions
- Multi-Provider Support - OpenRouter, Anthropic, DeepSeek APIs
- Automatic Failover - Smart switching between providers/models
- AI-to-AI Loops - Infinite improvement cycles
- Smart Data Collection - Contributes to AI model development
- High Availability - Multiple API keys with rotation
- Quality Filtering - Only valuable interactions collected
- Global Access - Works worldwide, no restrictions
Quick Setup
Option 1: OpenRouter (Recommended - Free Models Available)
{
"mcpServers": {
"zai-mcp-server": {
"command": "npx",
"args": [
"-y",
"zai-mcp-server@latest"
],
"env": {
"OPENROUTER_API_KEY": "sk-or-v1-abc123...,sk-or-v1-def456...,sk-or-v1-ghi789...",
"MODEL": "deepseek/deepseek-r1-0528:free"
}
}
}
}
Option 2: Anthropic Claude
{
"mcpServers": {
"zai-mcp-server": {
"command": "npx",
"args": [
"-y",
"zai-mcp-server@latest"
],
"env": {
"ANTHROPIC_API_KEY": "sk-ant-api03-abc123...",
"MODEL": "claude-3-5-sonnet-20241022"
}
}
}
}
Option 3: Google Gemini (Advanced Multimodal)
{
"mcpServers": {
"zai-mcp-server": {
"command": "npx",
"args": [
"-y",
"zai-mcp-server@latest"
],
"env": {
"GEMINI_API_KEY": "AIzaSyAbc123...",
"MODEL": "gemini-2.5-flash-preview-05-20"
}
}
}
}
Option 4: DeepSeek (Best Value)
{
"mcpServers": {
"zai-mcp-server": {
"command": "npx",
"args": [
"-y",
"zai-mcp-server@latest"
],
"env": {
"DEEPSEEK_API_KEY": "sk-abc123...",
"MODEL": "deepseek-chat"
}
}
}
}
Option 5: Multi-Model Voting (AI Consensus)
{
"mcpServers": {
"zai-mcp-server": {
"command": "npx",
"args": [
"-y",
"zai-mcp-server@latest"
],
"env": {
"OPENROUTER_API_KEY": "sk-or-v1-abc123...,sk-or-v1-def456...,sk-or-v1-ghi789...",
"ANTHROPIC_API_KEY": "sk-ant-api03-abc123...",
"DEEPSEEK_API_KEY": "sk-abc123...",
"MODEL": "voting-consensus",
"VOTING_PANEL": "general",
"VOTING_STRATEGY": "consensus"
}
}
}
}
Supported Models
OpenRouter Models (2025 Updated)
FREE Models (Recommended)
- deepseek/deepseek-r1-0528:free - NEW Latest DeepSeek R1 reasoning model (671B params)
- deepseek/deepseek-r1-0528-qwen3-8b:free - NEW Distilled 8B reasoning model
- mistralai/devstral-small:free - NEW 24B coding-focused model (SWE-Bench optimized)
- sarvamai/sarvam-m:free - NEW Multilingual model with reasoning (24B params)
- google/gemma-3n-e4b-it:free - NEW Google's latest Gemma model
- meta-llama/llama-3.3-8b-instruct:free - UPDATED Meta's latest Llama
- microsoft/phi-4-reasoning:free - NEW Microsoft's reasoning model
- microsoft/phi-4-reasoning-plus:free - NEW Enhanced reasoning model
- qwen/qwen3-8b:free - NEW Qwen3 8B model
- qwen/qwen3-14b:free - NEW Qwen3 14B model
- qwen/qwen3-30b-a3b:free - NEW Qwen3 30B model
- qwen/qwen3-32b:free - NEW Qwen3 32B model
- thudm/glm-z1-32b:free - NEW GLM reasoning model (32B params)
Premium Models
- anthropic/claude-opus-4 - NEW Most powerful Claude model (2025)
- anthropic/claude-sonnet-4 - NEW High performance Claude (2025)
- google/gemini-2.5-pro-preview - NEW Google's latest Gemini Pro
- google/gemini-2.5-flash-preview-05-20 - NEW Fast Gemini model
- openai/gpt-4o - OpenAI's flagship model
- openai/gpt-4o-mini - Compact and powerful
- openai/o1-preview - NEW OpenAI's reasoning model
- openai/o1-mini - NEW Compact reasoning model
- anthropic/claude-3-5-sonnet-20241022 - Recommended balance
- anthropic/claude-3-5-haiku-20241022 - Fastest Claude
- deepseek/deepseek-chat - General purpose (DeepSeek-V3)
- deepseek/deepseek-reasoner - NEW Advanced reasoning (DeepSeek-R1)
Anthropic Models (2025)
- claude-opus-4-20250514 - Most powerful (newest)
- claude-sonnet-4-20250514 - High performance (newest)
- claude-3-5-sonnet-20241022 - Recommended balance
- claude-3-5-haiku-20241022 - Fastest and cheapest
Note: Anthropic models are also available through OpenRouter with the following IDs:
- anthropic/claude-opus-4 - Claude Opus 4 via OpenRouter
- anthropic/claude-sonnet-4 - Claude Sonnet 4 via OpenRouter
- anthropic/claude-3-5-sonnet-20241022 - Claude 3.5 Sonnet via OpenRouter
- anthropic/claude-3-5-haiku-20241022 - Claude 3.5 Haiku via OpenRouter
DeepSeek Models (2025)
- deepseek-chat - General purpose (DeepSeek-V3)
- deepseek-reasoner - Advanced reasoning (DeepSeek-R1)
Note: DeepSeek models are also available FREE through OpenRouter:
- deepseek/deepseek-r1-0528:free - FREE Latest R1 reasoning model (671B params)
- deepseek/deepseek-r1-0528-qwen3-8b:free - FREE Distilled 8B version
- deepseek/deepseek-chat - General purpose via OpenRouter (paid)
- deepseek/deepseek-reasoner - Advanced reasoning via OpenRouter (paid)
Google Gemini Models (2025 - Confirmed Working)
- gemini-2.5-flash-preview-05-20 - LATEST Most advanced multimodal model (May 2025)
- gemini-2.0-flash - STABLE Next-gen features with enhanced speed
- gemini-2.0-flash-001 - STABLE Versioned 2.0 Flash model
- gemini-2.0-flash-lite - FAST Optimized for speed and cost efficiency
- gemini-1.5-flash-latest - RELIABLE Fast and versatile multimodal
- gemini-1.5-flash-8b-latest - LIGHTWEIGHT Efficient 8B parameter model
- gemini-1.5-flash - PRODUCTION Stable Flash model
- gemini-1.5-flash-8b - EFFICIENT Stable 8B model
Experimental Models (May have rate limits):
- gemini-2.5-pro-preview-06-05 - PREVIEW Most powerful reasoning model
- gemini-2.0-flash-thinking-exp-01-21 - EXPERIMENTAL Advanced thinking model
- gemini-2.0-flash-exp - EXPERIMENTAL Latest experimental features
Note: All models listed above have been tested and confirmed working with a valid API key. Gemini models support multimodal inputs (text, images, audio, video) and provide excellent performance for a wide range of tasks.
What's New in 2025
Latest Model Updates
- 13 FREE models now available through OpenRouter
- 4 CONFIRMED Gemini models tested and working (2025)
- DeepSeek R1 0528: Latest reasoning model with 671B parameters
- Gemini 2.5 Flash Preview: Google's most advanced multimodal model
- Mistral Devstral Small: 24B coding-focused model optimized for SWE-Bench
- Microsoft Phi-4: New reasoning models with enhanced capabilities
- Qwen3 Series: Multiple variants (8B, 14B, 30B, 32B) all available for free
- Anthropic Claude 4: Opus and Sonnet variants now available
- Google Gemini 2.0 Flash: Next-generation features with enhanced speed
Enhanced Features
- Multi-Model AI Voting: Multiple AI models vote on best responses
- AI Agent Panels: Specialized agent groups (coding, reasoning, general, gemini)
- Consensus Algorithms: Multiple voting strategies (majority, consensus, weighted)
- Smart Agent Selection: Performance-based agent selection
- Multi-Provider Support: OpenRouter + Google Gemini + DeepSeek + Anthropic
- Automatic Model Failover: Seamlessly switches between providers
- Smart API Key Rotation: Supports multiple keys per provider
- Enhanced Error Handling: Better recovery from API failures
- Real-time Status Monitoring: Track provider health and usage
- Improved Data Collection: Better quality filtering for AI training
Installation
VSCode MCP Configuration
- Open VSCode Settings (Ctrl/Cmd + ,)
- Search for "MCP" or go to Extensions → MCP
- Add the configuration above to your MCP settings
- Restart VSCode to activate
Alternative: Direct Installation
# Install globally
npm install -g zai-mcp-server
# Or run directly
npx zai-mcp-server@latest
Available Tools
AI-to-AI Loop Tools
- activate_infinite_loop - Start AI-to-AI improvement loops
- stop_ai_loops - Stop all active loops
- list_active_loops - View running loops
- get_ai_prompts - Get AI-generated prompts
- acknowledge_agent_response - Process AI responses
Multi-Model Voting Tools
- ai_voting_request - Submit prompt for multi-model AI consensus
- get_voting_history - View recent voting sessions
- get_agent_performance - Check AI agent performance stats
Provider Management Tools
- get_ai_provider_status - Check provider status
- reset_ai_providers - Reset failed providers
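For example, a client could check provider health before starting a long AI-to-AI loop and reset failed providers if any are reported. This is a minimal illustrative sketch in the style of the examples further below; the fields on the returned status object (such as failedProviders) are assumptions, not a documented response shape.
// Illustrative only - the shape of `status` is an assumption, not documented API.
const status = await get_ai_provider_status();
console.log('Provider status:', status);
// Hypothetical field name; adjust to the actual response.
if (status && Array.isArray(status.failedProviders) && status.failedProviders.length > 0) {
  await reset_ai_providers();
}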
AI Voting System
How It Works
The ZAI MCP Server features an advanced multi-model voting system where multiple AI agents collaborate to provide the best possible responses:
- Agent Selection: The system selects specialized AI agents based on the task
- Response Generation: Each agent generates its own response
- Voting Phase: All agents vote on which response is best
- Consensus: The system calculates consensus and selects a winner
- Learning: Agent performance is tracked and improved over time (a simplified sketch of this flow follows below)
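The sketch below is a simplified, illustrative outline of that flow in JavaScript. The helper functions (selectAgents, generateResponse, collectVotes, pickWinner, recordPerformance) are hypothetical names that mirror the five steps above; they are not part of the server's API.
// Illustrative outline of the voting flow; helper names are hypothetical.
async function runVotingRound(prompt, panel) {
  const agents = selectAgents(panel);                        // 1. Agent Selection
  const responses = await Promise.all(
    agents.map((agent) => generateResponse(agent, prompt))   // 2. Response Generation
  );
  const votes = await collectVotes(agents, responses);       // 3. Voting Phase
  const winner = pickWinner(votes, responses);               // 4. Consensus
  recordPerformance(agents, votes, winner);                  // 5. Learning
  return winner;
}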
Voting Panels
General Panel (Default)
- DeepSeek R1: Reasoning specialist (671B params)
- Mistral Devstral: Coding expert (24B params)
- Microsoft Phi-4: Analysis agent
- Qwen3 14B: General assistant
- Llama 3.3: Conversation expert
Coding Panel
- Mistral Devstral: Lead coding specialist
- DeepSeek R1 Qwen: Reasoning + coding
- Microsoft Phi-4+: Development expert
- Qwen3 32B: Large model for complex projects
Reasoning Panel
- DeepSeek R1: State-of-the-art reasoning
- Microsoft Phi-4: Analytical thinking
- GLM Z1: Creative reasoning
- Qwen3 30B: Data analysis
Premium Panel (Requires Paid APIs)
- Claude Opus 4: Most powerful model
- OpenAI o1: Advanced reasoning
- Gemini 2.5 Pro: Multimodal capabilities
- Claude Sonnet 4: High performance
Voting Strategies
- Majority: Simple majority wins (50%+ threshold)
- Consensus: Strong agreement required (70%+ threshold; see the tally sketch below)
- Weighted: Votes weighted by agent expertise and confidence
- Unanimous: All agents must agree (100% threshold)
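As an illustration of how such thresholds could be applied, the sketch below tallies votes and checks the leading response against a strategy's threshold. It is a hedged example of the general idea, not the server's actual implementation; a weighted strategy would additionally scale each vote by agent expertise and confidence.
// Illustrative tally: does the leading response meet the strategy's threshold?
const THRESHOLDS = { majority: 0.5, consensus: 0.7, unanimous: 1.0 };
function meetsThreshold(votes, strategy) {
  // votes: array of response IDs, one entry per voting agent
  const counts = {};
  for (const id of votes) counts[id] = (counts[id] || 0) + 1;
  const topShare = Math.max(...Object.values(counts)) / votes.length;
  return topShare >= THRESHOLDS[strategy];
}
console.log(meetsThreshold(['a', 'a', 'b', 'a'], 'consensus')); // 0.75 >= 0.7 -> true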
Example Usage
// Request AI voting on a coding problem
await ai_voting_request({
prompt: "Optimize this React component for performance",
panel: "coding",
strategy: "consensus",
maxAgents: 4
});
// Get voting history
await get_voting_history({ limit: 5 });
// Check agent performance
await get_agent_performance();
Usage Examples
Start AI-to-AI Loop
Use the "activate_infinite_loop" tool with:
- message: "actloop improve my React component performance"
- aiToAi: true
Check Provider Status
Use the "get_ai_provider_status" tool to see:
- Current provider and model
- Available API keys
- Failed providers
- Request statistics
Stop Loops
Use "stop_ai_loops" with:
- message: "stploop"
AI Model Development Contribution
Important Notice: By using the ZAI MCP Server, you acknowledge and agree that AI-to-AI interactions facilitated by this server may be utilized for AI model development, research, and improvement initiatives. This contributes to the advancement of artificial intelligence technology and helps create better AI systems for the community.
What Gets Collected:
- AI-to-AI problem-solving conversations
- Code generation and improvement examples
- Multi-iteration debugging sessions
- High-quality interactions (80%+ score)
What Gets Filtered Out:
- Low-quality responses
- Error-heavy conversations
- Personal information
- Non-problem-solving interactions
Data Usage:
- Training data is used to improve AI models
- Helps advance AI-to-AI collaboration research
- Contributes to open AI development
- All usage complies with applicable data protection regulations
Data Collection
This server automatically collects valuable AI-to-AI interactions for training data:
What Gets Collected:
- AI-to-AI problem-solving conversations
- Code generation and improvement examples
- Multi-iteration debugging sessions
- High-quality interactions (80%+ score)
What Gets Filtered Out:
- Low-quality responses
- Error-heavy conversations
- Personal information
- Non-problem-solving interactions
Data Usage:
- Training data is used to improve AI models
- Helps advance AI-to-AI collaboration research
- Contributes to open AI development
Configuration Options
Environment Variables
- OPENROUTER_API_KEY - Comma-separated OpenRouter keys
- ANTHROPIC_API_KEY - Comma-separated Anthropic keys
- DEEPSEEK_API_KEY - Comma-separated DeepSeek keys
- MODEL - Primary model to use
- ZAI_FREE_MODE - Always true (no license needed)
- ZAI_DATA_COLLECTION - Always true (automatic)
Multiple API Keys
Provide multiple keys per provider as a comma-separated list; the server automatically rotates between them for high availability.
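One way to picture the rotation is a round-robin index over the comma-separated key list, as in this illustrative sketch (not the server's actual code):
// Illustrative round-robin rotation over comma-separated keys.
const keys = (process.env.OPENROUTER_API_KEY || '').split(',').map((k) => k.trim());
let keyIndex = 0;
function nextKey() {
  const key = keys[keyIndex];
  keyIndex = (keyIndex + 1) % keys.length;
  return key;
}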
Advanced Features
Automatic Failover
- Switches between providers when one fails
- Rotates API keys automatically
- Tries different models for best results
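Conceptually, failover amounts to trying each configured provider in order until one succeeds. The sketch below is a hedged illustration of that idea; callProvider and the provider objects are hypothetical, not the server's real internals.
// Illustrative failover loop; callProvider is a hypothetical helper.
async function requestWithFailover(prompt, providers) {
  for (const provider of providers) {
    try {
      return await callProvider(provider, prompt);
    } catch (err) {
      console.warn(`Provider ${provider.name} failed, trying next:`, err.message);
    }
  }
  throw new Error('All providers failed');
}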
Smart Data Collection
- Only collects valuable AI interactions
- Filters out errors and low-quality responses
- Compresses and stores efficiently
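The 80%+ quality bar mentioned in the data-collection sections can be pictured as a simple score-and-filter step. The heuristic below is a made-up illustration of that idea, not the server's actual filter:
// Illustrative quality filter; the scoring heuristic is an assumption.
function shouldCollect(interaction) {
  if (interaction.containsPersonalInfo) return false; // never collect personal information
  let score = 1.0;
  if (interaction.hasErrors) score -= 0.5;            // penalize error-heavy conversations
  if (!interaction.solvesProblem) score -= 0.3;       // prefer problem-solving interactions
  return score >= 0.8;                                // only high-quality interactions (80%+)
}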
High Availability
- Multiple API providers
- Multiple keys per provider
- Automatic error recovery
Why It's Free
This MCP server is completely free because:
- No License Validation - No restrictions or paywalls
- Community Driven - Open source development
- Data Collection - Valuable training data helps fund development
- AI Advancement - Contributes to AI research and development
Contributing
We welcome contributions! This project helps advance AI-to-AI collaboration research.
License
MIT License - Use freely in any project, commercial or personal.
Links
- NPM: zai-mcp-server
Start using your FREE multi-provider AI MCP server today!