po3_MCP
What is po3_MCP
po3_MCP is a lightweight Model Context Protocol (MCP) server implementation that allows access to OpenAI’s o3 model and other models via Poe’s API. It enables integration of Poe’s AI capabilities into MCP-compatible applications.
Use cases
Use cases for po3_MCP include building chatbots, creating intelligent assistants, automating content generation, and enhancing applications with natural language processing capabilities.
How to use
To use po3_MCP, clone the repository, set up a virtual environment, install the required dependencies, and configure your Poe API key in the `.env` file. Run the server with `python poe_o3_mcp_server.py` and send MCP protocol messages through standard input/output.
Key features
Key features of po3_MCP include a simple implementation using FastMCP, direct integration with Poe’s API, model selection via command-line flags, asynchronous request handling, comprehensive error handling and logging, and easy setup and configuration.
Where to use
po3_MCP can be used in various fields that require AI capabilities, such as software development, data analysis, customer support, and any application that can benefit from advanced language models.
Poe o3 MCP Server
A lightweight Model Context Protocol (MCP) server implementation that provides access to OpenAI’s o3 model and other models via Poe’s API. This server allows you to integrate Poe’s AI capabilities into any MCP-compatible application.
Features
- Simple MCP server implementation using FastMCP
- Direct integration with Poe’s API to access the o3 model and other models
- Model selection via command-line style flags in prompts
- Asynchronous request handling for efficient processing
- Comprehensive error handling and logging
- Easy setup and configuration
Prerequisites
- Python 3.8+
- A Poe API key (obtainable from https://poe.com/api_key)
Installation
1. Clone this repository:

   ```
   git clone https://github.com/Anansitrading/po3_MCP.git
   cd po3_MCP
   ```

2. Create a virtual environment (recommended):

   ```
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```

3. Install the required dependencies:

   ```
   pip install -r requirements.txt
   ```

4. Set up your environment variables:

   ```
   cp sample.env .env
   ```

5. Edit the `.env` file and add your Poe API key:

   ```
   POE_API_KEY=your_poe_api_key_here
   ```
Usage
Running the MCP Server
Run the server with:
```
python poe_o3_mcp_server.py
```
The server will start and listen for MCP protocol messages on standard input/output.
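Beyond raw standard input/output, many MCP clients launch stdio servers from a JSON configuration file. A sketch of such an entry is shown below; the exact schema and file location vary by client, and the `poe-o3` name and script path are placeholders:

```json
{
  "mcpServers": {
    "poe-o3": {
      "command": "python",
      "args": ["/path/to/po3_MCP/poe_o3_mcp_server.py"],
      "env": { "POE_API_KEY": "your_poe_api_key_here" }
    }
  }
}
```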
Model Selection via Flags
You can select different models available on Poe by adding a flag to your prompt:
```
--Claude-3.5-Sonnet Tell me about quantum computing
```
This will route your query to the Claude-3.5-Sonnet model instead of the default o3 model.
The flag can be placed anywhere in the message:
- At the beginning: `--GPT-4 What is the capital of France?`
- In the middle: `Tell me --Claude-3-Opus about the history of Rome`
- At the end: `What are the three laws of robotics? --Claude-3.5-Sonnet`
The flag will be automatically removed from the message before it’s sent to the model.
If no flag is specified, the server defaults to using the “o3” model.
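The flag handling described above can be sketched with a small parser. This is an illustration of the behavior, not the server's actual implementation; `parse_model_flag` is a hypothetical helper name:

```python
import re

DEFAULT_MODEL = "o3"

def parse_model_flag(message):
    """Split a '--ModelName' flag out of a prompt.

    Returns (model, cleaned_message); falls back to DEFAULT_MODEL
    when no flag is present.
    """
    match = re.search(r"--([\w.\-]+)", message)
    if not match:
        return DEFAULT_MODEL, message.strip()
    model = match.group(1)
    # Remove the flag from the message, wherever it appears
    cleaned = (message[:match.start()] + message[match.end():]).strip()
    # Collapse the double space left where the flag was removed
    cleaned = re.sub(r"\s{2,}", " ", cleaned)
    return model, cleaned

print(parse_model_flag("Tell me --Claude-3-Opus about the history of Rome"))
# ('Claude-3-Opus', 'Tell me about the history of Rome')
```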
Integrating with MCP Clients
This server provides two tools:
- `o3_query` - Send a query to the o3 model (or another model via flags) and get a response
- `ping` - A simple test tool that returns "pong"
Example of using the server with an MCP client:
```python
from mcp.client import MCPClient

# Connect to the MCP server
client = MCPClient(server_command=["python", "path/to/poe_o3_mcp_server.py"])

# Call the o3_query tool with the default o3 model
response = client.call_tool("o3_query", {"message": "Tell me about quantum computing"})
print(response)

# Call the o3_query tool with a different model using a flag
response = client.call_tool("o3_query", {"message": "--Claude-3.5-Sonnet Tell me about quantum computing"})
print(response)

# Test the connection with ping
ping_response = client.call_tool("ping", {})
print(ping_response)  # Should print "pong"
```
You can also run the included example script:
```
python example.py
```
Configuration
The server uses the following environment variables:
- `POE_API_KEY`: Your Poe API key (required)
- `LOG_LEVEL`: Logging level (optional, defaults to DEBUG)
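As an illustration of how these two variables might be read, here is a minimal sketch; `load_config` is a hypothetical helper, not the server's actual code:

```python
import logging
import os

def load_config(env=os.environ):
    """Read the server's two environment variables with documented defaults."""
    key = env.get("POE_API_KEY", "")
    if not key:
        logging.warning("POE_API_KEY is not set; requests to Poe will fail")
    return {
        "POE_API_KEY": key,
        # DEBUG is the documented default for LOG_LEVEL
        "LOG_LEVEL": env.get("LOG_LEVEL", "DEBUG"),
    }

cfg = load_config({"POE_API_KEY": "sk-example"})
print(cfg["LOG_LEVEL"])  # DEBUG
```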
Troubleshooting
If you encounter issues:
- Check that your Poe API key is valid and correctly set in the `.env` file
- Ensure you have the correct dependencies installed
- Check the server logs for detailed error messages
- Verify that you have an active internet connection
- If using a model flag, make sure the model name is correct and available on Poe
License
MIT
Acknowledgements