TianGong-AI-MCP
What is TianGong-AI-MCP?
TianGong-AI-MCP is a server that implements the TianGong AI Model Context Protocol (MCP), supporting the STDIO, SSE, and Streamable HTTP transports for communication.
Use cases
Use cases include deploying AI models for chatbots, real-time analytics applications, and any system requiring efficient data streaming and processing.
How to use
To use TianGong-AI-MCP, you can start the server using STDIO or set up a remote SSE server. You can also run it using Docker for containerized deployment.
Key features
Key features include support for STDIO and SSE protocols, easy setup with Docker, and the ability to run locally or remotely.
Where to use
TianGong-AI-MCP can be used in various fields such as AI model deployment, real-time data processing, and interactive applications that require efficient communication protocols.
Content
TianGong-AI-MCP
The TianGong AI Model Context Protocol (MCP) server supports the STDIO, SSE, and Streamable HTTP transport protocols.
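Whichever transport is used, the payload is the same: MCP messages are JSON-RPC 2.0. As a sketch, a client's opening initialize request looks roughly like this (field values are illustrative, following the MCP specification):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

The transports differ only in how these messages travel: over the process's stdin/stdout (STDIO), over a server-sent-events stream plus an HTTP message endpoint (SSE), or over a single HTTP endpoint (Streamable HTTP).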
Starting MCP Server
Client STDIO Server
# Install the MCP server package
npm install -g @tiangong-ai/mcp-server

# Launch the STDIO server, loading environment variables from .env
npx dotenv -e .env -- \
npx -p @tiangong-ai/mcp-server tiangong-ai-mcp-stdio
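Most MCP clients launch STDIO servers from a JSON configuration file. A sketch of such an entry (the `tiangong-ai` key is an arbitrary label, and the file location depends on the client, e.g. `claude_desktop_config.json` for Claude Desktop):

```json
{
  "mcpServers": {
    "tiangong-ai": {
      "command": "npx",
      "args": ["-p", "@tiangong-ai/mcp-server", "tiangong-ai-mcp-stdio"]
    }
  }
}
```

Note that variables the server reads from .env would need to be supplied through the client's `env` field in this setup, since the client, not dotenv, launches the process.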
Remote SSE Server
# Install the MCP server package and supergateway
npm install -g @tiangong-ai/mcp-server
npm install -g supergateway

# Expose the STDIO server over SSE on port 3001
npx dotenv -e .env -- \
npx -y supergateway \
--stdio "npx -y -p @tiangong-ai/mcp-server tiangong-ai-mcp-stdio" \
--port 3001 \
--ssePath /sse --messagePath /message
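Once the gateway is running, SSE-capable MCP clients connect to the /sse endpoint and post messages to /message. A client-side configuration sketch (the exact schema varies between clients; treat this as illustrative):

```json
{
  "mcpServers": {
    "tiangong-ai": {
      "url": "http://localhost:3001/sse"
    }
  }
}
```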
Using Docker
# Build MCP server image using Dockerfile (optional)
docker build -t linancn/tiangong-ai-mcp-server:0.0.13 .
# Pull MCP server image
docker pull linancn/tiangong-ai-mcp-server:0.0.13
# Start MCP server using Docker
docker run -d \
--name tiangong-ai-mcp-server \
--publish 9277:9277 \
--env-file .env \
linancn/tiangong-ai-mcp-server:0.0.13
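The same container can also be managed declaratively. A docker-compose.yml sketch equivalent to the docker run command above (the service name is an assumption):

```yaml
services:
  tiangong-ai-mcp-server:
    image: linancn/tiangong-ai-mcp-server:0.0.13
    container_name: tiangong-ai-mcp-server
    ports:
      - "9277:9277"
    env_file:
      - .env
```

Start it with `docker compose up -d`.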
Development
Environment Setup
# Install Node.js
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.2/install.sh | bash
nvm install 22
nvm use
# Install dependencies
npm install
# Update dependencies
npm update && npm ci
Code Formatting
# Format code using the linter
npm run lint
Local Testing
STDIO Server
# Launch the STDIO Server using MCP Inspector
npm start
SSE Server
# Build and package the project
npm run build && npm pack
# Optionally, install supergateway globally
npm install -g supergateway
# Launch the SSE server (if the --baseUrl parameter is set, it must be a valid IP address or domain name)
npx dotenv -e .env -- \
npx -y supergateway \
--stdio "npx -y -p tiangong-ai-mcp-server-0.0.13.tgz tiangong-ai-mcp-stdio" \
--port 3001 \
--ssePath /sse \
--messagePath /message
# Launch MCP Inspector
npx @modelcontextprotocol/inspector
Publishing
npm login
npm run build && npm publish