What is Mcpx Py
mcpx-py is a Python client library designed for interacting with AI models using tools from mcp.run, enabling developers to build applications that leverage large language models (LLMs).
Use cases
Use cases for mcpx-py include creating chatbots that summarize content, generating text based on prompts, and developing applications that require interaction with LLMs.
How to use
To use mcpx-py, install it with `uv add mcpx-py` or `pip install mcpx-py`. Set up an mcp.run session ID with `npx --yes -p @dylibso/mcpx gen-session --write`, then use the library to send messages to AI models.
Key features
Key features of mcpx-py include support for various AI models, structured output capabilities, and easy integration with mcp.run tools.
Where to use
mcpx-py can be used in fields such as AI development, chatbots, content generation, and any application that requires natural language processing.
mcpx-py
A Python library for interacting with LLMs using mcp.run tools
Features
AI Provider Support
mcpx-py supports all models supported by PydanticAI
Dependencies
- uv
- npm
- ollama (optional)
mcp.run Setup
You will need to get an mcp.run session ID by running:
npx --yes -p @dylibso/mcpx gen-session --write
This will generate a new session and write the session ID to a configuration file that can be used
by mcpx-py.
If you need to store the session ID in an environment variable, you can run `gen-session`
without the `--write` flag:
npx --yes -p @dylibso/mcpx gen-session
which should output something like:
Login successful! Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
Then set the MCP_RUN_SESSION_ID environment variable:
$ export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
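Once exported, your own code can read the session ID from the environment before starting a chat. A minimal stdlib sketch; the helper name is hypothetical and not part of mcpx-py, which picks the session up from its config file or environment on its own:

```python
import os

def get_session_id(env=os.environ):
    """Read the mcp.run session ID set by gen-session (hypothetical
    helper for early failure; mcpx-py reads this itself)."""
    session_id = env.get("MCP_RUN_SESSION_ID")
    if not session_id:
        raise RuntimeError("MCP_RUN_SESSION_ID is not set; run gen-session first")
    return session_id
```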
Python Usage
Installation
Using uv:
uv add mcpx-py
Or pip:
pip install mcpx-py
Example code
from mcpx_py import Chat
llm = Chat("claude-3-5-sonnet-latest")
# Or OpenAI
# llm = Chat("gpt-4o")
# Or Ollama
# llm = Chat("ollama:qwen2.5")
# Or Gemini
# llm = Chat("gemini-2.0-flash")
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
It’s also possible to get structured output by setting `result_type`:
from mcpx_py import Chat, BaseModel, Field
from typing import List

class Summary(BaseModel):
    """
    A summary of some longer text
    """
    source: str = Field(description="The source of the original text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")

llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)
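Conceptually, structured output means the model returns JSON that is validated against the Summary schema, so the parsed result exposes typed fields. A stdlib-only sketch of that idea (a stand-in dataclass and example JSON, not mcpx-py code):

```python
import json
from dataclasses import dataclass
from typing import List

# Stand-in for the pydantic Summary model above (sketch only).
@dataclass
class Summary:
    source: str
    original_text: str
    items: List[str]

# The kind of JSON a model might produce for result_type=Summary.
raw = '{"source": "example.com", "original_text": "...", "items": ["a", "b"]}'
summary = Summary(**json.loads(raw))
```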
More examples can be found in the examples/ directory
Command Line Usage
Installation
uv tool install mcpx-py
From git:
uv tool install git+https://github.com/dylibso/mcpx-py
Or from the root of the repo:
uv tool install .
uvx
mcpx-client can also be executed without being installed using uvx:
uvx --from mcpx-py mcpx-client
Or from git:
uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client
Running
Get usage/help
mcpx-client --help
Chat with an LLM
mcpx-client chat
List tools
mcpx-client list
Call a tool
mcpx-client tool eval-js '{"code": "2+2"}'
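Tool arguments are passed as a single JSON object. When driving the CLI from a script, it can be safer to build that argument with `json.dumps` and `shlex.quote` than to hand-quote it (this assumes `mcpx-client` is on your PATH):

```python
import json
import shlex

# Build the JSON argument for `mcpx-client tool eval-js` programmatically.
payload = json.dumps({"code": "2+2"})
command = "mcpx-client tool eval-js " + shlex.quote(payload)
```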
LLM Configuration
Provider Setup
Claude
- Sign up for an Anthropic API account at https://console.anthropic.com
- Get your API key from the console
- Set the environment variable:
ANTHROPIC_API_KEY=your_key_here
OpenAI
- Create an OpenAI account at https://platform.openai.com
- Generate an API key in your account settings
- Set the environment variable:
OPENAI_API_KEY=your_key_here
Gemini
- Create a Gemini account at https://aistudio.google.com
- Generate an API key in your account settings
- Set the environment variable:
GEMINI_API_KEY=your_key_here
Ollama
- Install Ollama from https://ollama.ai
- Pull your desired model:
ollama pull llama3.2
- No API key needed; runs locally
Llamafile
- Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
- Make the file executable:
chmod +x your-model.llamafile
- Run in JSON API mode:
./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
- Use with the OpenAI provider pointing to http://localhost:8080
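Each hosted provider above is enabled by setting its API key in the environment. A small stdlib sketch (a hypothetical helper, not part of mcpx-py) that reports which providers are configured, using the variable names from the setup steps; Ollama and Llamafile run locally and need no key:

```python
import os

# Provider -> environment variable holding its API key
# (names taken from the provider setup steps above).
PROVIDER_KEYS = {
    "claude": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "gemini": "GEMINI_API_KEY",
}

def configured_providers(env=os.environ):
    """Return the providers whose API key is set (hypothetical helper)."""
    return [name for name, var in PROVIDER_KEYS.items() if env.get(var)]
```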