mcp-openai
What is mcp-openai
mcp-openai is an MCP client that provides an OpenAI compatible API, allowing applications to interact with Large Language Models (LLMs) using the Model Context Protocol (MCP).
Use cases
Use cases include building user interfaces for LLMs, integrating with local inference engines, and developing applications that leverage text generation and function calling capabilities.
How to use
To use mcp-openai, add it to your project dependencies with 'uv add mcp-openai' or 'pip install mcp-openai'. You can then create an MCP client configured with your custom settings in Python.
Key features
Key features include compatibility with the OpenAI API, support for various local inference engines, and the ability to manage Python installations and virtual environments using uv.
Where to use
mcp-openai can be used in AI applications that require interaction with LLMs, particularly in environments that support the OpenAI API.
Content
mcp-openai
MCP Client with OpenAI compatible API
Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to LLMs. Think of MCP like a USB-C port for AI applications.
[!WARNING]
This is a simple toy project. Support is not planned. Use as a reference for minimal MCP client development.
This is an MCP client (not a server). It is meant to be used as a library for building LLM UIs that support MCP through an OpenAI compatible API. This opens the door to locally runnable inference engines (vLLM, Ollama, TGI, llama.cpp, LMStudio, …) that support the OpenAI API (text generation, function calling, etc.).
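As a concrete illustration of the "function calling" side of an OpenAI compatible API, here is a sketch of an OpenAI-style tool definition of the kind a client like this can expose server tools through. The tool name and parameters below are invented for illustration and are not part of mcp-openai:

```python
# Hedged sketch: an OpenAI-style function-calling tool definition.
# The name and parameter schema are illustrative only.
weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Return the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}
```

An MCP client would presumably translate each tool advertised by a connected MCP server into a definition of this shape before sending it to the inference engine.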
Usage
It is highly recommended to use uv in a project based on mcp-openai:
- It manages Python installations and virtual environments.
- It is an executable that can run self-contained Python scripts (in our case, MCP servers).
- It is used for CI workflows.
Add mcp-openai to your project dependencies with:
uv add mcp-openai
or with classic pip:
pip install mcp-openai
Create an MCP client
Now you can create an MCP client by specifying your custom configuration.
import os

from mcp_openai import MCPClient
from mcp_openai import config

mcp_client_config = config.MCPClientConfig(
    mcpServers={
        "the-name-of-the-server": config.MCPServerConfig(
            command="uv",
            args=["run", "path/to/server/scripts.py/or/github/raw"],
        )
        # add other servers here ...
    }
)

llm_client_config = config.LLMClientConfig(
    api_key="api-key-for-auth",
    base_url="https://api.openai.com/v1",
)

llm_request_config = config.LLMRequestConfig(model=os.environ["MODEL_NAME"])

client = MCPClient(
    mcp_client_config,
    llm_client_config,
    llm_request_config,
)
Connect and process messages with the MCP client
async def main():
    # Establish the connection between the client and the server.
    await client.connect_to_server(server_name)

    # messages_in comes from the user interacting with the LLM,
    # e.g. a UI making use of this MCP client.
    messages_in = ...
    messages_out = await client.process_messages(messages_in)
    # messages_out contains the LLM response. If required, the LLM makes use
    # of the tools offered by the connected servers.
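Since the client speaks an OpenAI compatible API, the messages it processes presumably follow the OpenAI chat-completions message format. A minimal sketch of what messages_in might look like (the roles and contents are invented for illustration):

```python
# Hedged sketch: messages_in in the OpenAI chat-completions format.
# The contents are illustrative only, not part of mcp-openai.
messages_in = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "List the files in the project root."},
]

# Every message carries a role and a content field.
assert all("role" in m and "content" in m for m in messages_in)
```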