MCP Explorer

MCPClient for OpenAI API

@doubleheikeron 9 months ago
3 MIT
Free Community
AI Systems
This is a simple MCP client implementation for OpenAI API request and response formats.

Overview

What is MCPClient-for-OpenAI-API

MCPClient-for-OpenAI-API is a simple Python client designed to facilitate requests and responses to the OpenAI API. It serves as a testing tool for MCP servers that adhere to OpenAI API formats.

Use cases

Use cases include testing new API implementations, developing applications that require AI interactions, and experimenting with different models and contexts in a controlled environment.

How to use

To use MCPClient-for-OpenAI-API, ensure you have the required `mcp` package and a `.env` file configured with your proxy URL and OpenAI API key. Run the client using the command: uv run .\client.py <YourMCPServerPath>\<servername>.py

Key features

Key features include query input for calling the MCP server, automatic model selection from available models, customizable context size for conversation history, and the ability to clear context for new queries.

Where to use

MCPClient-for-OpenAI-API can be used in various fields that require interaction with OpenAI API formats, such as AI research, application development, and testing environments for machine learning models.

Content

MCPClient-for-OpenAI-API

This is a simple MCP client, implemented in Python, for the OpenAI API request and response formats.
It can be used to test an MCP server with the OpenAI API. Any API that follows the OpenAI API format can use this MCP client (e.g. NewAPI, OneAPI, etc.).

Requirements

  • The mcp package is essential.
  • A .env file is needed, containing the following:
OPENAI_BASE_URL=https://{yourProxyURL}/v1/ # No need for an official OpenAI API key
OPENAI_API_KEY=sk-1234567890abcdefghijlmnopqrstuvwxyz

Functions

Query: Type your query; if it requires calling the MCP server, the LLM will call it.
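In the OpenAI response format, a tool request arrives as `tool_calls` entries on the assistant message. A hedged sketch of how a client might dispatch those requests to MCP-provided tools (the function name and the plain-dict message shape are illustrative assumptions, not the project's actual code):

```python
import json

def dispatch_tool_calls(message, tools):
    """If the assistant message requests tool calls, run each one and return
    'tool' role messages for the follow-up request; otherwise return the
    plain text content. `tools` maps tool name -> callable (a stand-in for
    tools exposed by the MCP server)."""
    calls = message.get("tool_calls") or []
    if not calls:
        return message.get("content")
    results = []
    for call in calls:
        name = call["function"]["name"]
        args = json.loads(call["function"]["arguments"])  # arguments arrive as a JSON string
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": str(tools[name](**args)),
        })
    return results
```

The tool results are then appended to the conversation and sent back to the model, which produces the final answer.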

Model selection: The client automatically fetches the available models from your base URL and lists them in the terminal, so you can choose a model yourself. The default model is gpt-4o, and you can switch to another model by entering model at any time.
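The selection step can be sketched as a small helper that prefers the user's choice, falls back to the default, and otherwise takes the first model available (illustrative only; the real client obtains the list from the API's /models endpoint):

```python
def choose_model(available, requested=None, default="gpt-4o"):
    """Pick a model id from the list of models the base URL advertises.
    Preference order: the user's request, then the default, then the
    first available model; None if the list is empty."""
    if requested in available:
        return requested
    if default in available:
        return default
    return available[0] if available else None
```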

Context setting: The default context size is 5, meaning the client retains your last 5 (user/tool and assistant) exchanges as conversation history. You can change the size by entering context and setting your own value.
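Keeping only the last N exchanges amounts to trimming the message list before each request. A sketch under the assumption that one exchange is a user/assistant message pair and any system message is preserved (the client's exact bookkeeping may differ):

```python
def trim_history(messages, context_size=5):
    """Keep any leading system message plus the last `context_size`
    exchanges (2 messages per exchange) as conversation history."""
    system = [m for m in messages if m["role"] == "system"][:1]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-2 * context_size:]
```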

Clear context: When you want to start a new, independent query, enter clear to wipe the conversation history.

Usage

uv run .\client.py <YourMCPServerPath>\<servername>.py
