Think MCP Host
What is Think MCP Host
Think MCP Host (AI·Zen·Love) is an intelligent agent application based on the Model Context Protocol (MCP) that facilitates seamless integration of various MCP resources, tools, and prompts into conversations, powered by multiple large language models (LLMs), vision language models (VLMs), and reasoning models.
Use cases
Use cases include automated customer service chatbots, content generation tools for writers, educational assistants for students, programming help for developers, and visual content analysis applications.
How to use
To use Think MCP Host, users can interact with the application through its rich terminal interface or chat interface. Users can input commands, utilize prompts, and integrate resources dynamically during conversations. The application supports various model types for different tasks, including text generation, image analysis, and logical reasoning.
Key features
Key features include complete MCP implementation, extensive model support (LLMs, VLMs, reasoning models), advanced conversation management (automatic history saving, export options), and a rich terminal interface with markdown rendering, syntax highlighting, and interactive command suggestions.
Where to use
Think MCP Host can be utilized in various fields such as customer support, content creation, education, software development, and any domain requiring intelligent conversational agents and resource integration.
Clients Supporting MCP
The following are the main client software that supports the Model Context Protocol. Click the link to visit the official website for more information.
Content
Think MCP Host (AI·Zen·Love)
Think MCP Host (AI·Zen·Love) is a Model Context Protocol (MCP) based intelligent agent application that supports various types of large language models, including standard conversational models (LLM), vision language models (VLM), and reasoning models.
Features
- **Complete MCP (Model Context Protocol) Implementation**
  - Full MCP architecture support (Host/Client/Server)
  - Comprehensive support for MCP resource types
    - Resources: dynamic integration of external content
    - Prompts: template-based system prompts
    - Tools: AI-powered function calls
  - Dynamic MCP command insertion anywhere in conversations
  - Seamless integration of resources into context
  - On-demand prompt template usage
  - Direct tool execution within chat
  - Standalone MCP tool execution support
- **Extensive Model Support**
  - LLM (Language Models)
    - Text conversations and content generation
    - Programming and code assistance
    - Document writing and analysis
  - VLM (Vision Language Models)
    - Image understanding and analysis
    - Visual content processing
  - Reasoning Models
    - Complex logical analysis
    - Professional domain reasoning
  - Multiple provider support (DeepSeek, OpenAI, OpenRouter, etc.)
- **Advanced Conversation Management**
  - Automatic conversation history saving
  - Manual save options with countdown timer
  - Historical conversation loading
  - Multiple export format support
- **System Features**
  - Rich terminal interface
    - Markdown rendering in the terminal
    - Syntax highlighting for code blocks
    - Unicode and emoji support
    - Interactive command suggestions
  - Cross-platform support
    - Full functionality on Windows, macOS, and Ubuntu
    - Native installation support for each platform
    - Consistent user experience across systems
  - Command-line interface
  - Debug mode support
  - Flexible exit options with save/discard choices
Usage Guide
Running Mode Selection
The program supports two main running modes:
- **Chat Mode (default)**
  - Used for natural language dialogue
  - Supports multiple LLM models
  - Can use MCP enhancement features
- **Tool Mode**
  - Used for running specific AI tools
  - Directly calls functions provided by the MCP server
Detailed Usage Process
1. **Select a running mode**
   - After startup, you will be prompted to select a running mode
   - Enter `1` to select Chat mode
   - Enter `2` to select Tool mode
2. **Chat Mode setup process**
   1. Select an LLM model
      - The system displays the list of available models
      - Enter the corresponding number to select a model
      - Supported providers include DeepSeek, Silicon Flow, Volcano Engine, etc.
   2. Choose a start method
      - Option 1: Set a system prompt, then start a new conversation
      - Option 2: Directly start a new conversation (default)
      - Option 3: Load a historical conversation
   3. Set the system prompt (if Option 1 was selected)
      - Enter a custom system prompt
      - Supports the `->mcp` command for inserting MCP resources
   4. Load a historical conversation (if Option 3 was selected)
      - The system displays the list of saved conversations
      - Select the conversation record to load
3. **Tool Mode setup process**
   1. Select an MCP client
      - The system displays the list of available MCP clients
      - Select the client to use
   2. Select a tool
      - Displays the tool list provided by the selected client
      - Select the specific tool to use
   3. Execute the tool
      - Provide the parameters the tool requires
      - View the tool's execution results
   4. Continue or exit
      - Choose whether to continue with other tools
      - You can switch back to Chat mode at any time
Basic Chat Mode
- Type text directly to converse
- Use `Ctrl+C` to exit the program
MCP Enhanced Mode
During a conversation, you can use the `->mcp` command to access MCP's enhancement features. The steps are as follows:
1. **Activate the MCP command**
   - Type `->mcp` on its own and press Enter
   - The system will guide you through the subsequent selections
2. **Select an MCP client**
   - The system displays the list of available MCP clients
   - Select the client to use
3. **Select an MCP feature type**
   The system prompts you to choose one of three types:
   - **Resources**
     - Enter `1` to select
     - Used for selecting and referencing external resources (such as images and documents)
     - Returned format: `->mcp_resources[client_name]:resourceURI`
   - **Prompts**
     - Enter `2` to select
     - Used for selecting predefined prompt templates
     - Returned format: `->mcp_prompts[client_name]:prompt_name{parameters}`
   - **Tools**
     - Enter `3` to select
     - Used for selecting and running specific AI tools
     - Returned format: `->mcp_tools[client_name]:tool_name{parameters}`
4. **Complete the selection**
   - The system inserts the corresponding MCP command into the conversation
   - You can continue editing the message or send it directly
Installation and Running
Development Installation
Before installing from package repositories, you can install the project directly from source for development:
```bash
# Clone the repository
git clone https://github.com/thinkthinking/think-mcp-host.git
cd think-mcp-host

# Create and activate a virtual environment
python -m venv .venv
source .venv/bin/activate       # on Linux/macOS
# .venv\Scripts\Activate.ps1    # on Windows with PowerShell

# Install in development mode with pip
pip install -e .
# or with uv (recommended)
uv pip install -e .
```
Windows
- Installation methods
  - Download and double-click `AI-Zen-Love.exe`
  - Or install and run via the command line:

```powershell
# Install uv using pip
python -m pip install uv

# Clone the project and install
git clone https://github.com/thinkthinking/think-mcp-host.git
cd think-mcp-host
python -m venv .venv
.venv\Scripts\Activate.ps1
uv pip install -e .
```

- Configuration file locations
  - LLM configuration: `C:\Users\your-username\.think-llm-client\config\servers_config.json`
  - MCP configuration: `C:\Users\your-username\.think-mcp-client\config\mcp_config.json`
  - History records: `C:\Users\your-username\.think-mcp-host\command_history\`
macOS
- Installation methods
  - Download and double-click `AI-Zen-Love.app`
  - Or install and run via the terminal:

```bash
# Install uv
python3 -m pip install uv

# Clone the project and install
git clone https://github.com/thinkthinking/think-mcp-host.git
cd think-mcp-host
python3 -m venv .venv
source .venv/bin/activate
uv pip install -e .
```

- Configuration file locations
  - LLM configuration: `/Users/your-username/.think-llm-client/config/servers_config.json`
  - MCP configuration: `/Users/your-username/.think-mcp-client/config/mcp_config.json`
  - History records: `/Users/your-username/.think-mcp-host/command_history/`
Configuration Details
Model Configuration
The project supports three types of models:
1. **LLM (Language Models)**
   - Used for: text conversations, code writing, document generation
   - Examples: DeepSeek Chat, GPT-4
2. **VLM (Vision Language Models)**
   - Used for: image understanding and analysis
   - Examples: GPT-4-Vision, Qwen-VL-Plus
3. **Reasoning Models**
   - Used for: complex reasoning and professional analysis
   - Examples: DeepSeek Reasoner, DeepSeek-R1
LLM Configuration
The configuration file uses JSON and is organized by model type:
```json
{
  "llm": {
    "providers": {
      "deepseek": {
        "api_key": "<DEEPSEEK_API_KEY>",
        "api_url": "https://api.deepseek.com",
        "model": {
          "deepseek-chat": {
            "max_completion_tokens": 8192
          }
        }
      }
    }
  },
  "vlm": {
    "providers": {
      "openai": {
        "api_key": "<OPENAI_API_KEY>",
        "api_url": "https://api.openai.com/v1",
        "model": {
          "gpt-4-vision": {
            "max_completion_tokens": 4096
          }
        }
      }
    }
  },
  "reasoning": {
    "providers": {
      "deepseek": {
        "api_key": "<DEEPSEEK_API_KEY>",
        "api_url": "https://api.deepseek.com",
        "model": {
          "deepseek-reasoner": {
            "max_completion_tokens": 8192,
            "temperature": 0.6
          }
        }
      }
    }
  }
}
```
Configuration explanation:
- Choose the configuration section according to the model type (`llm`/`vlm`/`reasoning`)
- Multiple providers can be configured under each type
- Provider documentation:
  - DeepSeek: https://api-docs.deepseek.com/en/
  - Silicon Flow: https://docs.siliconflow.cn/en/userguide/quickstart#4-siliconcloud-api-genai
  - Volcano Engine: https://www.volcengine.com/docs/82379/1399008
- Each provider needs the following keys:
  - `api_key`: API key
  - `api_url`: API server address
  - `model`: per-model configuration
    - `max_completion_tokens`: maximum output length
    - `temperature`: sampling temperature (optional)
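As a sketch of how a configuration with this layout can be consumed, the snippet below loads a `servers_config.json` shaped like the example above and lists every configured model. The `list_models` helper is illustrative, not part of the project itself:

```python
# Sketch: enumerate the models configured in a servers_config.json
# laid out as in the example above (type -> providers -> model).
import json
from pathlib import Path


def list_models(config_path: Path) -> list[str]:
    """Return "type/provider/model" entries found in the config file."""
    config = json.loads(config_path.read_text(encoding="utf-8"))
    entries = []
    for model_type in ("llm", "vlm", "reasoning"):
        providers = config.get(model_type, {}).get("providers", {})
        for provider, spec in providers.items():
            for model_name in spec.get("model", {}):
                entries.append(f"{model_type}/{provider}/{model_name}")
    return entries


if __name__ == "__main__":
    # Default location on macOS/Linux, per the configuration paths above.
    path = Path.home() / ".think-llm-client" / "config" / "servers_config.json"
    if path.exists():
        for entry in list_models(path):
            print(entry)
```

A quick sanity check of a config is then just `list_models(path)`, which should return one entry per model you expect to see in the selection menu.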
MCP Server Configuration
MCP (Model Context Protocol) server configuration example:
```json
{
  "mcpServers": {
    "think-mcp": {
      "command": "/opt/homebrew/bin/uv",
      "args": [
        "--directory",
        "/Users/thinkthinking/src_code/nas/think-mcp",
        "run",
        "think-mcp"
      ]
    }
  }
}
```
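For illustration, here is a minimal sketch of what a host does with such an entry: read `mcp_config.json`, look up a server by name, and spawn its process with stdin/stdout pipes for the MCP stdio transport. The `start_server` helper is hypothetical; the real host drives the JSON-RPC handshake through an MCP client library rather than raw pipes:

```python
# Sketch: spawn an MCP server process described by an mcpServers entry.
# This only starts the process; speaking MCP over the pipes is left to
# an MCP client implementation.
import json
import subprocess
from pathlib import Path


def start_server(config_path: Path, name: str) -> subprocess.Popen:
    """Spawn the MCP server named `name` from the given config file."""
    config = json.loads(config_path.read_text(encoding="utf-8"))
    spec = config["mcpServers"][name]
    return subprocess.Popen(
        [spec["command"], *spec.get("args", [])],
        stdin=subprocess.PIPE,    # stdio transport: JSON-RPC over stdin/stdout
        stdout=subprocess.PIPE,
    )
```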
MCP Commands
The following MCP command formats can be used in conversations:
1. **Interactive Selection**

   ```bash
   ->mcp
   ```

   This starts an interactive selection interface, guiding you to choose:
   - MCP client
   - Operation type (Resources/Prompts/Tools)
   - Specific resource/prompt/tool
   - Related parameters (if needed)

2. **Direct Usage**

   ```bash
   # Use resources
   ->mcp_resources[client_name]:resource_uri

   # Use prompts
   ->mcp_prompts[client_name]:prompt_name{param1:value1,param2:value2}

   # Use tools
   ->mcp_tools[client_name]:tool_name{param1:value1,param2:value2}
   ```

   Examples:

   ```bash
   # Use prompts
   ->mcp_prompts[think-mcp]:agent-introduction{agent_name:AI Assistant,agent_description:A friendly AI assistant}

   # Use tools
   ->mcp_tools[think-mcp]:analyze_content{text:This is a test text}
   ```
Features
- Support for multiple MCP commands in the same input
- Commands can be edited and modified at any time
- Parameters support flexible key-value pair format
- Friendly error prompts
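To make the command grammar concrete, here is a small parser for the `->mcp_*` syntax documented above (`type[client]:name{key:value,...}`). The grammar is taken from the examples; this is an illustrative sketch, not the host's own implementation:

```python
# Sketch: parse a ->mcp_* command string into its component parts.
import re

MCP_CMD = re.compile(
    r"->mcp_(?P<kind>resources|prompts|tools)"  # feature type
    r"\[(?P<client>[^\]]+)\]"                   # client name
    r":(?P<name>[^{\s]+)"                       # resource URI / prompt / tool name
    r"(?:\{(?P<params>[^}]*)\})?"               # optional {key:value,...} params
)


def parse_mcp_command(text: str) -> dict:
    """Return kind, client, name, and params parsed from an MCP command."""
    m = MCP_CMD.search(text)
    if not m:
        raise ValueError(f"not an MCP command: {text!r}")
    params = {}
    if m.group("params"):
        for pair in m.group("params").split(","):
            key, _, value = pair.partition(":")
            params[key.strip()] = value.strip()
    return {
        "kind": m.group("kind"),
        "client": m.group("client"),
        "name": m.group("name"),
        "params": params,
    }
```

For example, `parse_mcp_command("->mcp_tools[think-mcp]:analyze_content{text:This is a test text}")` yields kind `tools`, client `think-mcp`, tool `analyze_content`, and `{"text": "This is a test text"}` as parameters.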
Releasing New Versions
To release a new version, follow these steps:
1. **Update the version number**
   - Update the `version` field in `pyproject.toml`
   - Follow Semantic Versioning
2. **Commit the changes**

   ```bash
   git add pyproject.toml
   git commit -m "chore: bump version to x.x.x"
   ```

3. **Create and push a version tag**

   ```bash
   git tag vx.x.x
   git push origin vx.x.x
   ```
Dev Tools Supporting MCP
The following are the main code editors that support the Model Context Protocol. Click the link to visit the official website for more information.