MCP Client
What is MCP Client
mcp-client is a Python client for connecting to Model Context Protocol (MCP) servers. It can talk to both stdio and SSE MCP servers, allowing LLM agents to use tools through a standardized protocol.
Use cases
Use cases for mcp-client include connecting to Python or JavaScript MCP servers for AI model interactions, utilizing different LLM providers for specific tasks, and managing conversations in applications that require real-time communication.
How to use
To use mcp-client, clone the repository, install the required packages, set up your API keys in a .env file, and run the client script with the server path or URL along with an optional LLM provider.
Key features
Key features include the ability to connect to both stdio and SSE MCP servers, an interactive chat interface with various LLM options, automatic result processing for tool calls, conversation history management, and detailed logging for debugging.
Where to use
mcp-client can be used in various fields including AI development, chatbot creation, and any application that needs LLM-driven tool use over the Model Context Protocol.
Content
MCP Client
A Python client for connecting to Model Context Protocol (MCP) servers. This client allows you to interact with both stdio and SSE MCP servers, enabling LLM agents to use tools through a standardized protocol.
Features
- Connect to stdio MCP servers (Python and JavaScript)
- Connect to SSE MCP servers
- Interactive chat interface with multiple LLM options:
  - Anthropic Claude 3.5 Sonnet (default)
  - OpenAI GPT-4o
  - Google Gemini 2.0 Flash
- Tool calling with automatic result processing
- Conversation history management with refresh capability
- Detailed logging for debugging and monitoring
Prerequisites
- Python 3.8+
- API keys set as environment variables:
  - ANTHROPIC_API_KEY for Anthropic Claude
  - OPENAI_API_KEY for OpenAI GPT models
  - GOOGLE_API_KEY for Google Gemini
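As a sketch of how a client like this might resolve the right key per provider (the helper name and mapping here are illustrative, not taken from client.py):

```python
import os

# Map each supported provider to the environment variable the README
# says it reads. require_api_key is a hypothetical helper name.
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "gemini": "GOOGLE_API_KEY",
}

def require_api_key(provider):
    """Return the API key for the chosen provider, or fail loudly."""
    var = PROVIDER_KEYS[provider]
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"{var} is not set; add it to your .env file")
    return key
```

Failing early like this gives a clearer error than letting the provider SDK reject an empty key mid-conversation.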
Installation
- Clone the repository
- Install the required packages:
pip install mcp-protocol-client anthropic openai google-genai python-dotenv
- Create a .env file with your API keys:
ANTHROPIC_API_KEY=your_anthropic_api_key
OPENAI_API_KEY=your_openai_api_key
GOOGLE_API_KEY=your_google_api_key
- Create a logs directory to store client logs:
mkdir logs
Usage
Basic Usage
python client.py <server_script_path_or_url> [llm_provider]
Where:
- <server_script_path_or_url> is either a path to an MCP server script or a URL to an SSE MCP server
- [llm_provider] is optional and can be one of: anthropic (default), openai, or gemini
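The argument contract above can be sketched as a small parser; parse_target is a hypothetical name, not a function from client.py:

```python
def parse_target(argv):
    """Sketch of the CLI contract: return (server, llm_provider, is_sse).

    An http(s) URL is treated as an SSE server; anything else as a
    stdio server script or npm package.
    """
    if len(argv) < 2:
        raise SystemExit(
            "usage: python client.py <server_script_path_or_url> [llm_provider]"
        )
    server = argv[1]
    provider = argv[2] if len(argv) > 2 else "anthropic"  # default per the README
    if provider not in ("anthropic", "openai", "gemini"):
        raise SystemExit(f"unknown llm_provider: {provider}")
    is_sse = server.startswith(("http://", "https://"))
    return server, provider, is_sse
```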
Examples
- Connect to a Python MCP server using Anthropic Claude (default):
python client.py ./weather.py
- Connect to a JavaScript npm MCP server using OpenAI:
python client.py @playwright/mcp@latest openai
- Connect to an SSE MCP server using Google Gemini:
python client.py http://localhost:8000/sse gemini
Interactive Chat Commands
- Type your queries to interact with the LLM and tools
- Type refresh to clear conversation history
- Type quit to exit the application
Development
The client includes VS Code launch configurations for various setups, making it easy to debug and test with different servers and LLM providers.
How It Works
The MCP Client:
- Connects to an MCP server (either stdio or SSE)
- Lists available tools from the server
- Processes user queries by:
- Sending the query to the selected LLM with available tools
- Detecting and executing tool calls when the LLM requests them
- Sending tool results back to the LLM for processing
- Providing the final response to the user
- Maintains conversation history for context
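The query-processing steps above can be sketched as a loop. This is a self-contained illustration, not the client's actual code: the LLM call and the MCP tool call are passed in as plain functions, where the real client would go through the provider SDK and the MCP session:

```python
def run_query(query, llm, call_tool, history):
    """Send a query, execute any tool calls, and return the final reply.

    llm(history) -> {"content": str, "tool_call": (name, args) | absent}
    call_tool(name, args) -> str  (executed on the MCP server)
    """
    history.append({"role": "user", "content": query})
    reply = llm(history)
    # Keep going while the model requests tools.
    while reply.get("tool_call"):
        name, args = reply["tool_call"]
        result = call_tool(name, args)          # run the tool on the server
        history.append({"role": "tool", "name": name, "content": result})
        reply = llm(history)                    # let the model see the result
    history.append({"role": "assistant", "content": reply["content"]})
    return reply["content"]
```

Because history accumulates user, tool, and assistant turns, later queries keep full context until the user types refresh.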
Logging
Logs are stored in logs/mcp_client.log and are also displayed in the console. The logging level can be adjusted in the client.py file.