Mcp Proxy Pydantic Agent
What is mcp_proxy_pydantic_agent?
mcp_proxy_pydantic_agent is an example project that exposes MCP (Model Context Protocol) servers to Pydantic agents, so that an agent can call the tools those servers provide.
Use cases
Use cases for mcp_proxy_pydantic_agent include querying real-time information like weather updates or time conversions, and building applications that require interaction with LLMs for dynamic responses.
How to use
To use mcp_proxy_pydantic_agent, clone the repository, run 'uv sync', change into the mcp-client directory, and execute either 'uv run client.py' or 'uv run client2.py', depending on your requirements. Make sure you have API keys for both OpenAI and Anthropic.
Key features
Key features include integration of multiple LLMs, support for Pydantic.AI, and two client scripts with different trade-offs: one talks to the Anthropic libraries directly, the other uses pure Pydantic.
Content
A sample showing how to integrate MCP (Model Context Protocol) servers with Pydantic.AI.
Parts of this example use content from https://github.com/modelcontextprotocol/quickstart-resources.git, especially the weather server code.
The code uses two different LLMs purely for demonstration: the proxy agent uses gpt-4o and the tool uses Sonnet.
So, export both OPENAI_API_KEY and ANTHROPIC_API_KEY, or modify the code to suit your models.
The pyproject.toml assumes you are using the 'uv' package manager.
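Since both clients expect the API keys above to be set before they talk to the models, a small pre-flight check can save a confusing failure later. This is an illustrative helper, not code from the repo:

```python
import os

# Both demo clients need these two keys set in the environment.
REQUIRED_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY"]

def missing_keys(env=os.environ):
    """Return the names of required API keys that are unset or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

if __name__ == "__main__":
    absent = missing_keys()
    if absent:
        raise SystemExit(f"Set these before running the clients: {', '.join(absent)}")
```

Run it once before 'uv run client.py'; it exits with a message naming any key you still need to export.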
Steps to run
- Clone this repo
- uv sync
- cd mcp-client
- uv run client.py (requires both OpenAI and Anthropic keys; uses the Anthropic libraries directly)
- uv run client2.py (pure PydanticAI with no direct dependency on the Anthropic libraries; works with any function-calling LLM)
Now, try interacting with some questions like:
What is the time in NY when it is 7:30pm in Bangalore?
What is the weather currently in Chicago?
(and quit when done)
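The time question above is, at its core, a timezone conversion, which a time tool on the server side might implement along these lines. This is a minimal stdlib sketch; the function name and signature are illustrative, not taken from the repo:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

def convert_time(time_hm: str, date: str, src_zone: str, dst_zone: str) -> str:
    """Convert a wall-clock time on a given date from one IANA timezone to another."""
    naive = datetime.fromisoformat(f"{date}T{time_hm}")
    localized = naive.replace(tzinfo=ZoneInfo(src_zone))
    return localized.astimezone(ZoneInfo(dst_zone)).strftime("%H:%M")

# 7:30pm in Bangalore (IST, UTC+5:30) on a winter date is 9:00am in New York (EST, UTC-5).
print(convert_time("19:30", "2025-01-15", "Asia/Kolkata", "America/New_York"))  # 09:00
```

Note that the date matters: on a summer date New York is on EDT (UTC-4), so the same question yields 10:00 instead.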