Saqr-MCP
What is Saqr-MCP
Saqr-MCP is a Python application that implements the Model Context Protocol (MCP) to enable AI assistant capabilities using local models. It features a client-server architecture where the client interacts with local models through Ollama, while the server provides tools for large language models (LLMs).
Use cases
Use cases for Saqr-MCP include developing AI chatbots, creating interactive AI-driven applications, enhancing customer service with automated responses, and conducting research with local AI models.
How to use
To use Saqr-MCP, ensure that Ollama is running with your chosen model. Clone the repository, set up a virtual environment, install dependencies, and run the client using `python main.py`. You can then type your queries in the interactive console.
Key features
Key features of Saqr-MCP include an interactive chat interface for querying the model, integration with web search tools like DuckDuckGo, local model inference via Ollama, an asynchronous architecture for efficient processing, and visual loading animations for an enhanced user experience.
Where to use
Saqr-MCP can be used in various fields such as AI research, software development, customer support automation, and any application requiring local AI model interactions.
🦅 Saqr-MCP
Saqr-MCP is a powerful Python application that implements the Model Context Protocol (MCP) to enable advanced AI assistant capabilities. It supports both local models through Ollama and cloud models through Groq, providing a flexible client-server architecture. The server component offers a rich set of tools including web search, memory management, document generation, and advanced reasoning capabilities.
✨ Features
- 🤖 Interactive chat interface for querying models
- 🔄 Support for both local models (Ollama) and cloud models (Groq)
- 🔍 Advanced web search capabilities using Tavily API
- 📝 Word document generation from markdown content
- 🧠 Comprehensive memory management system using mem0
- 💭 Advanced reasoning and thought process tracking
- ⚡ Async architecture for efficient processing
- 🎨 Visual loading animations for better user experience
- 📊 Session-based thought logging and analysis
- 📄 Document generation with markdown support
📋 Prerequisites
- 🐍 Python 3.11 or higher
- 🦙 Ollama installed with local models (for local model usage)
- 📦 UV package manager (recommended)
🚀 Installation

1. Clone this repository:

   ```bash
   git clone https://github.com/ahmedhassan456/Saqr-MCP.git
   cd saqr-mcp
   ```

2. Create and activate a virtual environment (optional but recommended):

   ```bash
   uv venv
   # On Windows
   .venv\Scripts\activate
   # On Unix or macOS
   source .venv/bin/activate
   ```

3. Install dependencies:

   ```bash
   uv add -r requirements.txt
   ```

4. Set up environment variables:
   - Copy `.env.example` to `.env`
   - Configure the following variables:
     - 🔑 `MODEL_NAME`: Your preferred Ollama model (e.g., `qwen3:1.7b`)
     - 🔍 `TAVILY_API_KEY`: Your Tavily API key from the Tavily website
     - ⚡ `GROQ_MODEL_NAME`: Your preferred Groq model name
     - 🔐 `GROQ_API_KEY`: Your Groq API key from the Groq website
     - 🧠 `MEM0_API_KEY`: Your Mem0 API key from the Mem0 website
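After copying, a filled-in `.env` might look like the following (every value below is a placeholder, not a real key or a confirmed default; substitute your own):

```
MODEL_NAME=qwen3:1.7b
TAVILY_API_KEY=your-tavily-api-key
GROQ_MODEL_NAME=your-groq-model-name
GROQ_API_KEY=your-groq-api-key
MEM0_API_KEY=your-mem0-api-key
```

Note that the environment-variables table further down lists the model variable as `OLLAMA_MODEL_NAME` rather than `MODEL_NAME`; check `.env.example` in the repository for the exact name it expects.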
💻 Usage

1. For local model usage, ensure Ollama is running with your chosen model available.

2. Configure the client:
   - By default, the application uses the Ollama client (`from src.ollama_client import SaqrMCPClient`)
   - To use Groq instead, modify `main.py` to use `from src.groq_client import SaqrMCPClient`

3. Run the client:

   ```bash
   python main.py
   ```

4. Type your queries in the interactive console:

   ```
   MCP Client Started!
   Type your queries or 'quit' to exit.

   Query:
   ```

5. Type `quit` to exit the application.
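The interactive loop above can be sketched in plain Python. This is only an illustration of the described behavior, not the repository's client code; `handle_query` is a hypothetical stand-in for the real model call:

```python
def run_console(inputs, handle_query=lambda q: f"echo: {q}"):
    """Feed queries to the handler until 'quit' is entered.

    `inputs` is an iterable of query strings (the real client reads
    from stdin instead). Returns the list of responses produced.
    """
    responses = []
    for query in inputs:
        if query.strip().lower() == "quit":
            break  # mirrors the documented exit command
        responses.append(handle_query(query))
    return responses
```

In the real client the handler would forward each query, along with any tool results from the server, to the selected Ollama or Groq model asynchronously.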
📁 Project Structure

- 📄 `main.py` - Entry point that starts the MCP client
- 📂 `src/`
  - 🔄 `ollama_client.py` - MCP client implementation for Ollama models
  - ⚡ `groq_client.py` - MCP client implementation for Groq models
  - 🛠️ `server.py` - MCP server implementation with all tools
  - 📝 `logger.py` - Custom logging utilities with visual animations
🛠️ Available Tools
The server implements a comprehensive set of tools for various functionalities:
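Conceptually, an MCP server maps tool names to functions so that a model's tool call can be routed to the right implementation. The following stdlib-only sketch illustrates that pattern in miniature; it is not the actual `mcp` library API, and the `web_search` body is a placeholder:

```python
TOOLS = {}

def tool(fn):
    """Decorator: register a function as a callable tool by its name."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def web_search(query: str) -> str:
    # Placeholder body; the real tool queries the Tavily API.
    return f"results for {query!r}"

def dispatch(name: str, **arguments):
    """Invoke a registered tool with the arguments from a tool call."""
    return TOOLS[name](**arguments)
```

The real server registers each of the tools below in a similar name-to-function fashion, so the model only needs to emit a tool name and arguments.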
🔍 Web Search and Document Generation
- `web_search`: Performs real-time web searches using the Tavily API to retrieve up-to-date information
- `word_file_generator`: Creates Microsoft Word documents from markdown content with proper formatting

🧠 Memory Management
- `add_memory`: Stores new memories with specified types and content in mem0
- `get_all_memories`: Retrieves all stored memories, optionally filtered by type
- `search_memories`: Performs semantic search through stored memories to find relevant information
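The three memory tools share a simple contract that can be mimicked with an in-process store. This sketch substitutes naive substring matching for mem0's semantic search, purely to make the tool contract concrete:

```python
class MemoryStore:
    """Toy stand-in for the mem0-backed memory tools."""

    def __init__(self):
        self._memories = []

    def add_memory(self, memory_type, content):
        self._memories.append({"type": memory_type, "content": content})

    def get_all_memories(self, memory_type=None):
        # Optional filter by type, as the real tool allows.
        if memory_type is None:
            return list(self._memories)
        return [m for m in self._memories if m["type"] == memory_type]

    def search_memories(self, query):
        # Case-insensitive substring match; the real tool performs
        # semantic (embedding-based) search via mem0.
        q = query.lower()
        return [m for m in self._memories if q in m["content"].lower()]
```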
💭 Reasoning and Thought Process
- `think`: Records thoughts and reasoning processes for complex problem-solving
- `get_thoughts`: Retrieves all thoughts recorded in the current session
- `clear_thoughts`: Clears all recorded thoughts from the current session
- `get_thought_stats`: Provides detailed statistics about recorded thoughts
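Session-based thought logging can be pictured as an append-only list with summary statistics. This is a hedged sketch of the behavior described above, not the server's implementation (the exact statistics it reports are not documented here):

```python
class ThoughtLog:
    """In-memory log of thoughts for a single session."""

    def __init__(self):
        self._thoughts = []

    def think(self, thought):
        self._thoughts.append(thought)

    def get_thoughts(self):
        return list(self._thoughts)

    def clear_thoughts(self):
        self._thoughts.clear()

    def get_thought_stats(self):
        # One plausible set of stats: count and average length.
        lengths = [len(t) for t in self._thoughts]
        return {
            "count": len(lengths),
            "avg_length": sum(lengths) / len(lengths) if lengths else 0.0,
        }
```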
⚙️ Environment Variables

| Variable | Description | Default |
|---|---|---|
| 🔑 `OLLAMA_MODEL_NAME` | The name of the Ollama model to use (e.g., `qwen3:1.7b`) | None |
| 🔍 `TAVILY_API_KEY` | Tavily API key for web search capabilities | None |
| ⚡ `GROQ_MODEL_NAME` | The name of the Groq model to use | None |
| 🔐 `GROQ_API_KEY` | Groq API key for cloud model access | None |
| 🧠 `MEM0_API_KEY` | Mem0 API key for memory management | None |
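A client might read this table's variables as follows. This is a sketch assuming plain `os.environ`-style lookup; the repository may instead load them with a dotenv helper, and the choice of which keys to treat as required is an assumption here:

```python
import os

REQUIRED_KEYS = ("TAVILY_API_KEY", "GROQ_API_KEY", "MEM0_API_KEY")

def load_config(env=None):
    """Collect model names and API keys; report any missing keys."""
    env = os.environ if env is None else env
    return {
        "ollama_model": env.get("OLLAMA_MODEL_NAME"),
        "groq_model": env.get("GROQ_MODEL_NAME"),
        "missing_keys": [k for k in REQUIRED_KEYS if not env.get(k)],
    }
```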
📦 Dependencies

- 🔄 `mcp[cli]` - Model Context Protocol implementation
- 🌐 `httpx` - HTTP client for Python
- 📝 `loguru` - Python logging made simple
- ⚡ `groq` - Groq API client
- 🦙 `ollama` - Interface to Ollama for local models
- 🔍 `tavily-python` - Search engine tailored for AI agents
- 🚀 `fastapi` - Web framework for building APIs
- ⚡ `uvicorn` - ASGI server implementation
- 📄 `htmldocx` - HTML to DOCX converter
- 🔍 `duckduckgo-search` - Search engine integration
- 📄 `python-docx` - DOCX file handling
- 📝 `markdown` - Markdown processing
- 🧠 `mem0` - Memory management system
📄 License
🤝 Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
- 🍴 Fork the repository
- 🌿 Create your feature branch (`git checkout -b feature/amazing-feature`)
- 💾 Commit your changes (`git commit -m 'Add some amazing feature'`)
- 📤 Push to the branch (`git push origin feature/amazing-feature`)
- 🔄 Open a Pull Request