MCP-ChatBot
What is MCP-ChatBot
MCP-ChatBot is a powerful chatbot framework built using Rust, designed to support multiple servers and integrate various tools seamlessly.
Use cases
Use cases for MCP-ChatBot include automating customer service inquiries, providing real-time assistance in applications, and creating engaging chat experiences for users.
How to use
To use MCP-ChatBot, you need to install the framework on your server, configure it according to your requirements, and then deploy it to start interacting with users across multiple platforms.
Key features
Key features of MCP-ChatBot include multi-server support, tool integration capabilities, and a robust architecture that leverages the performance of Rust.
Where to use
MCP-ChatBot can be used in various fields such as customer support, personal assistance, and interactive entertainment, making it versatile for different applications.
MCP-ChatBot
A powerful Rust-based chatbot MCP (Model Context Protocol) framework with multi-server support and tool integration capabilities.
Features
- 🤖 Multi-AI Support: Seamlessly switch between Ollama (local) and OpenAI
- 🛠️ Tool Integration: Built-in support for memory, SQLite, and file operations
- 🔄 Multi-Server Architecture: Run multiple specialized servers simultaneously
- 💬 Interactive CLI: User-friendly command-line interface with history
- 📝 Customizable Prompts: Server-specific system prompts via YAML configuration
- 🔒 Secure: Environment-based API key management
- 📚 RAG Support: Retrieval Augmented Generation with Qdrant vector database
- 🎤 Voice Input: Speech-to-text capabilities using Whisper
- 🤖 Advanced NLP: Powered by rust-bert for text embeddings and language models
- 🧠 Deep Learning: PyTorch integration via tch for advanced model operations
- 📝 Text Processing: Efficient tokenization with Hugging Face tokenizers
Model Context Protocol (MCP)
MCP (Model Context Protocol) is a flexible protocol designed to enhance AI model interactions by providing structured context and tool integration capabilities. The protocol enables:
Core Components
- Context Management
  - Dynamic context switching between different AI models
  - Context persistence across sessions
  - Server-specific context configurations
- Tool Integration
  - Standardized tool interface for AI models
  - Automatic tool discovery and registration
  - Tool execution with retry mechanisms
  - Tool response processing and formatting
- Server Architecture
  - Modular server design for specialized operations
  - Inter-server communication protocol
  - Resource management and cleanup
  - Server-specific prompt configurations
- Protocol Features
  - JSON-based message format
  - Asynchronous operation support
  - Error handling and recovery
  - Resource cleanup and management
  - Tool execution monitoring
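The tool discovery, registration, and retry behavior described above can be sketched in plain Rust. This is a simplified, std-only illustration; the names (`ToolRegistry`, `register`, `execute`) are hypothetical and not the framework's actual API:

```rust
use std::collections::HashMap;

// A tool takes a string argument and may fail; the registry retries on error.
type Tool = Box<dyn Fn(&str) -> Result<String, String>>;

struct ToolRegistry {
    tools: HashMap<String, Tool>,
    max_retries: u32,
}

impl ToolRegistry {
    fn new(max_retries: u32) -> Self {
        Self { tools: HashMap::new(), max_retries }
    }

    // Registration: servers advertise tools by name.
    fn register(&mut self, name: &str, tool: Tool) {
        self.tools.insert(name.to_string(), tool);
    }

    // Execution with a simple retry loop; the last error is reported if all
    // attempts fail.
    fn execute(&self, name: &str, arg: &str) -> Result<String, String> {
        let tool = self.tools.get(name).ok_or(format!("unknown tool: {name}"))?;
        let mut last_err = String::new();
        for _ in 0..=self.max_retries {
            match tool(arg) {
                Ok(out) => return Ok(out),
                Err(e) => last_err = e,
            }
        }
        Err(last_err)
    }
}

fn main() {
    let mut registry = ToolRegistry::new(2);
    registry.register("echo", Box::new(|arg| Ok(format!("echo: {arg}"))));
    println!("{:?}", registry.execute("echo", "hello"));
}
```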
Protocol Flow
- Initialization
  - Server registration and configuration
  - Tool discovery and registration
  - Context initialization
- Operation
  - Context-aware tool execution
  - Response processing and formatting
  - Error handling and recovery
  - Resource management
- Cleanup
  - Resource release
  - Server shutdown
  - Context persistence
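The three-phase flow can be summarized as a trait with one method per phase. This is a hypothetical sketch (`McpServer`, `run_session`, and `EchoServer` are illustrative names, not the project's real types):

```rust
// Minimal sketch of the three-phase protocol flow: initialize, operate, cleanup.
trait McpServer {
    fn initialize(&mut self) -> Result<(), String>; // register tools, load context
    fn operate(&mut self, request: &str) -> Result<String, String>;
    fn cleanup(&mut self); // release resources, persist context
}

struct EchoServer {
    ready: bool,
}

impl McpServer for EchoServer {
    fn initialize(&mut self) -> Result<(), String> {
        self.ready = true;
        Ok(())
    }
    fn operate(&mut self, request: &str) -> Result<String, String> {
        if !self.ready {
            return Err("server not initialized".into());
        }
        Ok(format!("handled: {request}"))
    }
    fn cleanup(&mut self) {
        self.ready = false;
    }
}

// Drive a server through the full flow; cleanup runs regardless of the outcome.
fn run_session(server: &mut dyn McpServer, request: &str) -> Result<String, String> {
    server.initialize()?;
    let response = server.operate(request);
    server.cleanup();
    response
}

fn main() {
    let mut server = EchoServer { ready: false };
    println!("{:?}", run_session(&mut server, "ping"));
}
```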
Use Cases
- Multi-Model Collaboration: Coordinate multiple AI models for complex tasks
- Tool Integration: Seamlessly integrate external tools and services
- Context Management: Maintain consistent context across different operations
- Resource Management: Efficiently manage system resources and cleanup
Prerequisites
- Rust 1.70 or higher
- Ollama (for local AI support)
- OpenAI API key (optional, for OpenAI support)
- Qdrant vector database (for RAG support)
- Whisper model (for voice input)
AI Models and Tools
The project leverages several powerful AI and NLP tools:
Text Processing and Embeddings
- rust-bert: A Rust implementation of Hugging Face’s transformers library
  - Provides state-of-the-art text embeddings
  - Supports multiple language models
  - Enables efficient text processing and understanding
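For reference, text embeddings like these are typically compared with cosine similarity. A std-only sketch with made-up toy vectors (real sentence embeddings have hundreds of dimensions):

```rust
// Cosine similarity between two embedding vectors: the standard way to compare
// text embeddings such as those produced by rust-bert. The vectors below are
// tiny made-up examples.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm_a: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let norm_b: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if norm_a == 0.0 || norm_b == 0.0 {
        return 0.0; // avoid division by zero for degenerate vectors
    }
    dot / (norm_a * norm_b)
}

fn main() {
    let a = [0.2, 0.8, 0.1, 0.0];
    let b = [0.25, 0.7, 0.05, 0.1];
    println!("similarity: {:.3}", cosine_similarity(&a, &b));
}
```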
Deep Learning
- tch (PyTorch): Rust bindings for PyTorch
  - Enables deep learning model operations
  - Supports model inference and training
  - Provides GPU acceleration when available
Text Tokenization
- tokenizers: Hugging Face’s tokenizers library
  - Efficient text tokenization
  - Supports multiple tokenization algorithms
  - Enables consistent text processing across different models
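To illustrate what a tokenizer does at its simplest, here is a toy word-level tokenizer. Hugging Face's tokenizers use subword algorithms (BPE, WordPiece) rather than whole words, so this is only a conceptual stand-in, with hypothetical names throughout:

```rust
use std::collections::HashMap;

// A toy vocabulary tokenizer: maps words to integer ids, with an id of 0 for
// out-of-vocabulary words. Real tokenizers split text into subwords, but the
// text-to-ids mapping idea is the same.
struct ToyTokenizer {
    vocab: HashMap<String, u32>,
    unk_id: u32,
}

impl ToyTokenizer {
    fn new(words: &[&str]) -> Self {
        // Reserve id 0 for the unknown token; known words start at 1.
        let vocab = words
            .iter()
            .enumerate()
            .map(|(i, w)| (w.to_string(), i as u32 + 1))
            .collect();
        Self { vocab, unk_id: 0 }
    }

    fn encode(&self, text: &str) -> Vec<u32> {
        text.split_whitespace()
            .map(|w| *self.vocab.get(w).unwrap_or(&self.unk_id))
            .collect()
    }
}

fn main() {
    let tok = ToyTokenizer::new(&["hello", "world"]);
    println!("{:?}", tok.encode("hello world unknown"));
}
```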
Installation
- Clone the repository:

```bash
git clone https://github.com/arksong/mcp-chatbot.git
cd mcp-chatbot
```

- Build the project:

```bash
cargo build --release
```
Configuration
- Create a `.env` file in the project root:

```
LLM_API_KEY=your_ollama_key
OPENAI_API_KEY=your_openai_key # Optional
```

- Configure servers in `src/servers_config.json`
- Customize prompts in `mcp_prompts.yaml`
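For illustration, environment-based keys can be read from the process environment with the standard library alone. The real project may load the `.env` file via a crate; `api_key` is a hypothetical helper, and the variable names follow the `.env` example above:

```rust
use std::env;

// Read an API key from the environment, treating unset or empty values the
// same way (as "not configured").
fn api_key(name: &str) -> Option<String> {
    env::var(name).ok().filter(|v| !v.is_empty())
}

fn main() {
    // OPENAI_API_KEY is optional: OpenAI support is simply unavailable
    // when no key is present.
    match api_key("OPENAI_API_KEY") {
        Some(_) => println!("OpenAI support enabled"),
        None => println!("OpenAI support disabled (no key set)"),
    }
}
```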
Usage
Using Ollama (Local AI)
- Install Ollama:
# macOS
brew install ollama
# Linux
curl -fsSL https://ollama.ai/install.sh | sh
- Pull the required model:
ollama pull llama3.2:latest
- Start the Ollama service:
ollama serve
- Run the chatbot:
cargo run
Using OpenAI
- Set your OpenAI API key:

```bash
export OPENAI_API_KEY=your_api_key
```

- Run the chatbot:

```bash
cargo run
```

- Switch to OpenAI using the `/ai` command
Using RAG (Retrieval Augmented Generation)
- Start Qdrant vector database using Docker:

```bash
# Using the provided setup script
./scripts/setup_qdrant.sh

# Or manually using Docker
docker run -d \
  --name qdrant \
  -p 6333:6333 \
  -p 6334:6334 \
  -v "$(pwd)/qdrant_storage:/qdrant/storage" \
  qdrant/qdrant:latest
```

- Add documents to the RAG database:

```
# Use the /rag-add command in the chatbot
/rag-add
# Then enter your document text
```

- Search similar documents:

```
# Use the /rag-search command
/rag-search
# Enter your search query
```

- View RAG database information:

```
/rag-info
```

- Docker Management Commands:

```bash
# Stop Qdrant container
docker stop qdrant

# Start Qdrant container
docker start qdrant

# View Qdrant logs
docker logs qdrant

# Remove Qdrant container (data will be preserved in qdrant_storage)
docker rm qdrant
```

Note: The Qdrant data is persisted in the `./qdrant_storage` directory, which is mounted as a volume in the Docker container. This ensures your vector data remains intact even if the container is removed.
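Conceptually, `/rag-add` stores a document with its embedding and `/rag-search` retrieves the closest stored document by similarity. A miniature in-memory stand-in for the Qdrant-backed store, with made-up toy vectors and a hypothetical `VectorStore` type (the real implementation talks to Qdrant over its API):

```rust
// Toy in-memory vector store: add (text, embedding) pairs, return the best
// match for a query vector by cosine similarity.
struct VectorStore {
    docs: Vec<(String, Vec<f32>)>,
}

impl VectorStore {
    fn new() -> Self {
        Self { docs: Vec::new() }
    }

    // Analogue of /rag-add: store a document alongside its embedding.
    fn add(&mut self, text: &str, embedding: Vec<f32>) {
        self.docs.push((text.to_string(), embedding));
    }

    // Analogue of /rag-search: return the most similar stored document.
    fn search(&self, query: &[f32]) -> Option<&str> {
        self.docs
            .iter()
            .max_by(|(_, a), (_, b)| {
                cosine(query, a).partial_cmp(&cosine(query, b)).unwrap()
            })
            .map(|(text, _)| text.as_str())
    }
}

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn main() {
    let mut store = VectorStore::new();
    store.add("rust is fast", vec![1.0, 0.0]);
    store.add("cats are soft", vec![0.0, 1.0]);
    println!("{:?}", store.search(&[0.9, 0.1]));
}
```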
Using Voice Input
- Start voice recording:

```
/voice
```

- Speak your message (press Enter to stop recording)
- The transcribed text will be processed as a normal message
Available Commands
- `/help` - Display help menu
- `/clear` - Clear the terminal screen
- `/usage` - Display usage information
- `/exit` - Exit the program
- `/servers` - List available MCP servers
- `/tools` - List available tools
- `/resources` - List available resources
- `/debug` - Toggle debug logging
- `/ai` - Switch between AI providers
- `/rag-add` - Add a new document to RAG database
- `/rag-search` - Search for similar documents
- `/rag-info` - Show RAG database information
- `/voice` - Start voice input (press Enter to stop recording)
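A chatbot loop typically routes slash commands like these with a simple dispatcher. A hedged sketch (command set abbreviated, handlers are placeholders, not the project's actual logic):

```rust
// Route an input line: slash commands are dispatched by name, anything else
// is treated as a chat message for the LLM.
fn dispatch(input: &str) -> String {
    let mut parts = input.splitn(2, ' ');
    let cmd = parts.next().unwrap_or("");
    match cmd {
        "/help" => "available commands: /help /clear /usage /exit ...".to_string(),
        "/ai" => "switching AI provider".to_string(),
        "/exit" => "bye".to_string(),
        other if other.starts_with('/') => format!("unknown command: {other}"),
        _ => format!("chat message: {input}"),
    }
}

fn main() {
    println!("{}", dispatch("/help"));
    println!("{}", dispatch("hello there"));
}
```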
Tool Examples
Memory Operations
SQLite Operations
File Operations
Project Structure
```
mcp-chatbot/
├── src/
│   ├── main.rs              # Main application entry
│   ├── llm_client.rs        # LLM client implementation
│   ├── mcp_server.rs        # MCP server core
│   ├── protocol.rs          # Protocol definitions
│   ├── sqlite_server.rs     # SQLite server implementation
│   ├── stdio_server.rs      # Standard I/O server
│   ├── rag_server.rs        # RAG server implementation
│   ├── whisper_server.rs    # Whisper server implementation
│   └── utils.rs             # Utility functions
├── tests/
│   ├── sqlite_test.rs       # SQLite tests
│   └── rag_server_test.rs   # RAG server tests
├── Cargo.toml               # Project dependencies
├── mcp_prompts.yaml         # System prompts configuration
└── README.md                # Project documentation
```
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
License
This project is licensed under the MIT License - see the LICENSE file for details.
Author
- arkSong - Initial work - [email protected]
Acknowledgments