Agent With MCP Example
What is agent-with-mcp-example
agent-with-mcp-example is a project that demonstrates how to use an agent with a local MCP server to obtain system CPU and memory statistics. It utilizes the psutil library for data collection and exposes the functionalities through FastAPI endpoints.
Use cases
Use cases include monitoring CPU and memory usage in development environments, providing insights for system administrators, and integrating with other applications that require resource statistics.
How to use
To use agent-with-mcp-example, ensure that you have the necessary prerequisites installed, including uv, direnv, and optionally mcptools. Configure your environment by copying envrc.template to .envrc and setting your OpenAI API key. Then, run the application to interact with the agent via a Gradio chat interface.
Key features
Key features include integration with the psutil library for system resource monitoring, FastAPI for creating endpoints, and a Gradio chat interface that maintains conversation history for follow-up questions.
Where to use
agent-with-mcp-example can be used in various fields such as system monitoring, performance analysis, and resource management in software applications that require real-time data on system resources.
Content
Agent with MCP Example
This project provides a simple example of an Agent and a local MCP server.
The MCP Server provides a collection of tools for obtaining system CPU and memory statistics.
It is built on the psutil library. The tools are implemented
as FastAPI endpoints and then exposed via MCP using fastapi-mcp.
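To make the server side concrete, here is a minimal sketch of how a psutil-backed tool could be exposed this way. It is an illustration under assumptions, not the actual contents of psutil_mcp.py, and the exact fastapi-mcp calls may vary by version:

```python
# Illustrative sketch only -- psutil_mcp.py may define more tools and use
# different names/options. Requires: fastapi, fastapi-mcp, psutil, uvicorn.
import psutil
from fastapi import FastAPI
from fastapi_mcp import FastApiMCP

app = FastAPI()

@app.get("/cpu_times", operation_id="cpu_times")
def cpu_times() -> dict:
    """Return system CPU times as a total across all CPUs."""
    return psutil.cpu_times()._asdict()

# Wrap the FastAPI app so its endpoints are also published as MCP tools
# (fastapi-mcp serves them under /mcp by default).
mcp = FastApiMCP(app)
mcp.mount()

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, port=8000)
```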
The Agent is part of a simple Gradio chat application. The agent uses the Pydantic.ai
agent framework. The agent is given the MCP Server’s URL and a system prompt indicating
that it should answer questions about system resource usage. The Gradio Chat component
maintains a conversation history so that you can ask follow-up questions.
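The agent wiring might look roughly like the following. This is a sketch under assumptions (model name, prompt text, and the pydantic-ai MCP client API, which has changed between releases), not the actual contents of chat.py:

```python
# Illustrative sketch only -- chat.py may use a different model, prompt,
# and pydantic-ai API surface (MCP client class names vary by version).
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerHTTP

server = MCPServerHTTP(url="http://localhost:8000/mcp")

agent = Agent(
    "openai:gpt-4o",
    mcp_servers=[server],
    system_prompt=(
        "Answer questions about this machine's CPU and memory usage "
        "using the available tools."
    ),
)

async def answer(question: str) -> str:
    # Open the MCP connection for the duration of the run.
    async with agent.run_mcp_servers():
        result = await agent.run(question)
    return result.output  # attribute may be .data in older pydantic-ai versions
```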
Setup
Prerequisites
First make sure you have the following tools installed on your machine:
- uv, a package and environment manager for Python
- direnv, a tool for managing environment variables in your projects
- mcptools (optional), a command line utility for interacting with MCP servers. This program
is only needed if you want to test/debug the MCP server without the chat application. It is
really helpful for debugging your tools and making sure that the expected metadata is being
published by the MCP server. Note that the name of the program is mcpt if you install via
Homebrew on Mac and mcptools otherwise.
- These examples use OpenAI models for the Agent, so you will need an active account and API
key from OpenAI. Alternatively, you can use one of the other models supported by Pydantic.ai.
In that case, you will have to set the model and key appropriately.
Setup steps
Once you have the prerequisites installed, do the following steps:
- Copy envrc.template to .envrc and edit the value of OPENAI_API_KEY to your OpenAI token.
- Run direnv allow to put the changed environment variables into your environment.
- Run uv sync to create/update your virtual environment.
- You can start the MCP Server with uv run psutil_mcp.py. By default it will serve on port 8000.
Testing
If you have installed mcptools, you can connect to your MCP server and test it as follows:
```
$ mcptools shell http://localhost:8000/mcp   # use the command "mcpt" if you installed via Homebrew
mcp> tools
cpu_times
Get Cpu Times Return system CPU time as a total across all cpus. Every attribute represents the
...
mcp> call cpu_times
{
  "user": 119528.44,
  "nice": 0.0,
  "system": 67114.2,
  "idle": 2692773.55
}
mcp> exit
```
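The numbers returned by the cpu_times tool come straight from psutil, so you can sanity-check them directly in Python (field names vary slightly by operating system):

```python
# The cpu_times MCP tool is backed by psutil; the same data is available directly.
import psutil

times = psutil.cpu_times()   # namedtuple; fields like user/nice/system/idle on Linux/macOS
print(times._asdict())       # e.g. {'user': 119528.44, 'nice': 0.0, 'system': 67114.2, ...}
```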
Running
To run the full application:
- If you have not already started your MCP Server, you can run it with uv run psutil_mcp.py
- In another terminal window, start the chat server with uv run chat.py
- Point your browser to http://127.0.0.1:7860
Extras
The psutil_mcp.py and chat.py programs have some command line options to enable debugging, change the
model, change the ports, etc. Run them with the --help
option to see the available options.
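For illustration only, such options are typically declared with argparse along these lines; the real flag names in psutil_mcp.py and chat.py may differ, so trust --help over this sketch:

```python
# Hypothetical example of the kind of options these scripts expose; the actual
# flags may differ -- run the scripts with --help to see the real ones.
import argparse

parser = argparse.ArgumentParser(description="psutil MCP server (illustrative)")
parser.add_argument("--port", type=int, default=8000, help="port to serve on")
parser.add_argument("--debug", action="store_true", help="enable debug logging")
args = parser.parse_args()
print(args)
```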
There is a configuration for VSCode to use the MCP server at .vscode/mcp.json