Openapi Mcpserver Generator
What is Openapi Mcpserver Generator
openapi-mcpserver-generator is a command-line tool that generates Model Context Protocol (MCP) server code from OpenAPI specifications, facilitating the creation of an MCP server that connects Large Language Models (LLMs) with APIs.
Use cases
Use cases include generating MCP servers for AI applications, creating bridges between APIs and LLMs, and automating the setup of server infrastructure based on OpenAPI specifications.
How to use
To use openapi-mcpserver-generator, install it globally via npm, yarn, or pnpm. Then, run the command openapi-mcpserver-generator --openapi path/to/openapi.json --output /Path/to/output to generate an MCP server from your OpenAPI specification.
Key features
Key features include automatic tool generation for API endpoints, support for nested $ref in OpenAPI specifications, generation of MCP server configuration, easy environment-based configuration, and logging capabilities.
Where to use
openapi-mcpserver-generator can be used in software development environments where there is a need to integrate APIs with Large Language Models, particularly in applications involving AI and machine learning.
Content
OpenAPI to MCP server Generator
A command-line tool that generates Model Context Protocol (MCP) server code from OpenAPI specifications. This tool helps you quickly create an MCP server that acts as a bridge between LLMs (Large Language Models) and your API.
English | 简体中文
At the beginning
This repo was originally forked from openapi-mcp-generator, and adds some additional features:
- Support for nested $ref in OpenAPI specifications
- Generation of MCP server configuration, in addition to source code
- Clients can set the log level and receive log messages as notifications
- On error, messages are sent to stderr
- Support for building a Docker image and guiding the client to run in a Docker container (updated 2025/5/8)
Features
- Automatic Tool Generation: Converts each API endpoint in your OpenAPI spec into an MCP tool
- Transport Options: Only stdio is supported; for SSE you can leverage mcp-proxy
- Complete Project Setup: Generates all necessary files to run an MCP server
- Easy Configuration: Simple environment-based configuration for the generated server
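The tool-generation idea can be sketched as follows. This is a hypothetical helper with assumed shapes, not the generator's actual code: each OpenAPI operation becomes one MCP tool definition whose input schema is derived from the operation's parameters.

```javascript
// Sketch: how one OpenAPI operation might map to an MCP tool definition.
// Hypothetical shapes and names; the generator's real output may differ.
function operationToTool(path, method, operation) {
  return {
    // Prefer the spec's operationId; fall back to a method+path name
    name: operation.operationId || `${method}_${path}`,
    description: operation.summary || "",
    inputSchema: {
      type: "object",
      properties: Object.fromEntries(
        (operation.parameters || []).map((p) => [p.name, p.schema || {}])
      ),
    },
  };
}

const tool = operationToTool("/pets/{petId}", "get", {
  operationId: "getPetById",
  summary: "Find pet by ID",
  parameters: [{ name: "petId", in: "path", schema: { type: "integer" } }],
});
console.log(tool.name); // getPetById
```

An MCP server would then register one such tool per endpoint and translate tool calls into HTTP requests against the API.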
Installation
# Install globally from npm
npm install -g openapi-mcpserver-generator
# Or with yarn
yarn global add openapi-mcpserver-generator
# Or with pnpm
pnpm add -g openapi-mcpserver-generator
Usage
Generate an MCP server from an OpenAPI specification:
openapi-mcpserver-generator --openapi path/to/openapi.json --output /Path/to/output
Command Line Options
| Option | Alias | Description | Default |
|---|---|---|---|
| --openapi | -o | Path or URL to OpenAPI specification | (required) |
| --output | -d | Output directory for generated files | ./mcp-server |
| --name | -n | Name for the MCP server | openapi-mcp-server |
| --version | -v | Version for the MCP server | 1.0.0 |
| --transport | -t | Transport mechanism (stdio, websocket, http) | stdio |
| --help | -h | Show help information | |
Examples
Generate from a local OpenAPI file:
openapi-mcpserver-generator --openapi ./specs/petstore.json --output ./petstore-mcp
Generate from a remote OpenAPI URL:
openapi-mcpserver-generator --openapi https://petstore3.swagger.io/api/v3/openapi.json --output ./petstore-mcp
Generated Files
The tool generates the following files in the output directory:
- server.js - The main MCP server implementation
- package.json - Dependencies and scripts
- README.md - Documentation for the generated server
- .env.example - Template for environment variables
- types.d.ts - TypeScript type definitions for the API
- tsconfig.json - TypeScript configuration
- Dockerfile - Dockerfile for building the server image
- .dockerignore - Docker ignore file
Using the Generated Server
After generating your MCP server:
1. Navigate to the generated directory:

   cd my-mcp-server

2. Install dependencies:

   npm install

3. Create an environment file:

   cp .env.example .env

4. Edit .env to set your API base URL and any required headers:

   API_BASE_URL=https://api.example.com
   API_HEADERS=Authorization:Bearer your-token-here

5. Start the server:

   npm start
Requirements
- Node.js 16.x or higher
- npm 7.x or higher
E2E example
We suggest using mcpclihost as the MCP host to try it out. This tool (mcpclihost) supports both Azure OpenAI and DeepSeek.
You can add a configuration entry for the generated MCP server like this:

{
  "mcpServers": {
    "petstore-mcp": {
      "command": "/usr/local/bin/node",
      "args": [
        "/Users/lipeng/workspaces/github.com/vincent-pli/openapi-mcpserver-generator/petstore-mcp/server.js",
        "run"
      ]
    }
  }
}

to ~/.mcp.json (the default MCP server configuration path of mcpclihost), then give it a try.
Security Schemes in OpenAPI
OpenAPI 3.0 supports four security scheme types:
- apiKey:
  For example:

  "securitySchemes": {
    "my_api_key": {
      "type": "apiKey",
      "name": "api_key",
      "in": "header"
    }
  }

  The generated server expects an environment variable named by upper-casing the scheme key and the parameter name, i.e. MY_API_KEY_{securitySchemes.my_api_key.name}; in this case, MY_API_KEY_API_KEY must be defined in .env.
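A minimal sketch of that naming rule (hypothetical helper, not the generator's actual code):

```javascript
// Sketch: derive the expected env var name for an apiKey security scheme by
// upper-casing the scheme key and the header/query parameter name, joined
// with an underscore. Hypothetical helper; the real generated code may differ.
function apiKeyEnvName(schemeKey, paramName) {
  return `${schemeKey.toUpperCase()}_${paramName.toUpperCase()}`;
}

console.log(apiKeyEnvName("my_api_key", "api_key")); // MY_API_KEY_API_KEY
```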
- http:

  "securitySchemes": {
    "basicAuth": { "type": "http", "scheme": "basic" }
  }

  The server tries to find BASICAUTH_USERNAME and BASICAUTH_PASSWORD in .env.

  "securitySchemes": {
    "basicAuth": { "type": "http", "scheme": "bearer" }
  }

  The server tries to find BASICAUTH_BEARERTOKEN in .env.
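The env-variable lookup for http schemes can be sketched like this (hypothetical helper; the generated server's actual code may differ):

```javascript
// Sketch: build an Authorization header value for an "http" security scheme,
// following the env-variable naming convention described above.
// Hypothetical helper, not the generator's real implementation.
function httpAuthHeader(schemeKey, scheme, env) {
  const key = schemeKey.toUpperCase();
  if (scheme === "basic") {
    // Basic auth: base64-encode "username:password"
    const credentials = `${env[`${key}_USERNAME`]}:${env[`${key}_PASSWORD`]}`;
    return "Basic " + Buffer.from(credentials).toString("base64");
  }
  if (scheme === "bearer") {
    return "Bearer " + env[`${key}_BEARERTOKEN`];
  }
  throw new Error(`Unsupported http scheme: ${scheme}`);
}

console.log(
  httpAuthHeader("basicAuth", "bearer", { BASICAUTH_BEARERTOKEN: "my-token" })
); // Bearer my-token
```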
- oauth2:
  Because of the complexity of OAuth2, it cannot be handled automatically. We suggest obtaining the access token manually, then setting it in .env like this:

  API_HEADERS=Authorization:Bearer your-access-token-here
- openIdConnect:
  Not supported yet.
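For the API_HEADERS workaround above, the value can be parsed into a headers object with a small helper like this (the comma-separated Name:Value format is an assumption, and the helper is hypothetical; the generated server's actual parsing may differ):

```javascript
// Sketch: parse an API_HEADERS value such as
// "Authorization:Bearer token,X-Custom:value" into a headers object.
// The comma-separated Name:Value format is assumed, not confirmed.
function parseApiHeaders(raw) {
  const headers = {};
  if (!raw) return headers;
  for (const pair of raw.split(",")) {
    // Split on the first ":" only, since header values may contain colons
    const idx = pair.indexOf(":");
    if (idx === -1) continue; // skip malformed entries
    headers[pair.slice(0, idx).trim()] = pair.slice(idx + 1).trim();
  }
  return headers;
}

console.log(parseApiHeaders("Authorization:Bearer your-access-token-here"));
```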
License
Apache 2.0