Vercel AI Docs MCP
What is vercel-ai-docs-mcp?
vercel-ai-docs-mcp is a Model Context Protocol (MCP) server designed to provide AI-powered search and querying capabilities specifically for the Vercel AI SDK documentation. It allows developers to ask questions and receive accurate, contextualized answers based on the official documentation.
Use cases
Use cases for vercel-ai-docs-mcp include helping developers troubleshoot issues with the Vercel AI SDK, answering common questions quickly, and serving as a learning tool for developers new to the SDK.
How to use
To use vercel-ai-docs-mcp, developers need to set up the server by installing the necessary prerequisites such as Node.js and npm, and obtaining a Google API key for the Gemini model. After cloning the repository and configuring the environment variables, the server can be run to enable querying the Vercel AI SDK documentation.
Key features
Key features of vercel-ai-docs-mcp include direct documentation search using similarity search, an AI-powered agent for natural language questions, session management to maintain conversation context, and automated indexing tools for the latest documentation.
Where to use
vercel-ai-docs-mcp is primarily used in software development environments where developers need quick and accurate access to the Vercel AI SDK documentation, enhancing productivity and reducing the time spent searching for information.
Clients Supporting MCP
The following are the main client applications that support the Model Context Protocol. Follow the links to their official websites for more information.
Content
Vercel AI SDK Documentation MCP Agent
A Model Context Protocol (MCP) server that provides AI-powered search and querying capabilities for the Vercel AI SDK documentation. This project enables developers to ask questions about the Vercel AI SDK and receive accurate, contextualized responses based on the official documentation.
Features
- Direct Documentation Search: Query the Vercel AI SDK documentation index directly using similarity search
- AI-Powered Agent: Ask natural language questions about the Vercel AI SDK and receive comprehensive answers
- Session Management: Maintain conversation context across multiple queries
- Automated Indexing: Includes tools to fetch, process, and index the latest Vercel AI SDK documentation
Architecture
This system consists of several key components:
- MCP Server: Exposes tools via the Model Context Protocol for integration with AI assistants
- DocumentFetcher: Crawls and processes the Vercel AI SDK documentation
- VectorStoreManager: Creates and manages the FAISS vector index for semantic search
- AgentService: Provides AI-powered answers to questions using the Google Gemini model
- DirectQueryService: Offers direct semantic search of the documentation
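The split between VectorStoreManager and DirectQueryService can be illustrated with a simplified sketch. The real project delegates nearest-neighbour lookup to a FAISS index over embedded documentation chunks; the `IndexedChunk` type and the tiny hand-written vectors below are stand-ins for illustration, not the project's actual types:

```typescript
// Toy sketch of a direct similarity search: one embedding vector per
// documentation chunk, ranked by cosine similarity against the query.
// (Plain arrays stand in for the FAISS index; 3-d vectors stand in for
// real model embeddings.)

interface IndexedChunk {
  text: string;
  embedding: number[];
}

// Cosine similarity between two vectors of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the `limit` most similar chunks, mirroring the shape of the
// direct-query tool's arguments.
function directQuery(
  index: IndexedChunk[],
  queryEmbedding: number[],
  limit: number,
): IndexedChunk[] {
  return [...index]
    .sort(
      (x, y) =>
        cosineSimilarity(y.embedding, queryEmbedding) -
        cosineSimilarity(x.embedding, queryEmbedding),
    )
    .slice(0, limit);
}
```

In the actual server the embeddings come from an embedding model and the ranking is done inside FAISS, but the idea is the same: the index answers "which chunks are closest to this query?" and the query service shapes that into a tool response.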
Setup Instructions
Prerequisites
- Node.js 18+
- npm
- A Google API key for Gemini model access
Environment Variables
Create a .env file in the project root with the following variables:
GOOGLE_GENERATIVE_AI_API_KEY=your-google-api-key-here
You’ll need to obtain a Gemini API key from Google AI Studio.
Installation
1. Clone the repository

   ```bash
   git clone https://github.com/IvanAmador/vercel-ai-docs-mcp.git
   cd vercel-ai-docs-mcp-agent
   ```

2. Install dependencies

   ```bash
   npm install
   ```

3. Build the project

   ```bash
   npm run build
   ```

4. Build the documentation index

   ```bash
   npm run build:index
   ```

5. Start the MCP server

   ```bash
   npm run start
   ```
Integration with Claude Desktop
Claude Desktop is a powerful AI assistant that supports MCP servers. To connect the Vercel AI SDK Documentation MCP agent with Claude Desktop:
1. First, install Claude Desktop if you don’t have it already.

2. Open Claude Desktop settings (via the application menu, not within the chat interface).

3. Navigate to the “Developer” tab and click “Edit Config”.

4. Add the Vercel AI Docs MCP server to your configuration:

   ```json
   {
     "mcpServers": {
       "vercel-ai-docs": {
         "command": "node",
         "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
         "env": {
           "GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
         }
       }
     }
   }
   ```

   Make sure to replace:
   - `ABSOLUTE_PATH_TO_PROJECT` with the actual path to your project folder
   - `your-google-api-key-here` with your Google Gemini API key

5. Save the config file and restart Claude Desktop.

6. To verify the server is connected, look for the hammer 🔨 icon in the Claude chat interface.
For more detailed information about setting up MCP servers with Claude Desktop, visit the MCP Quickstart Guide.
Integration with Other MCP Clients
This MCP server is compatible with any client that implements the Model Context Protocol. Here are a few examples:
Cursor
Cursor is an AI-powered code editor that supports MCP servers. To integrate with Cursor:
1. Add a `.cursor/mcp.json` file to your project directory (for project-specific configuration) or a `~/.cursor/mcp.json` file in your home directory (for global configuration).

2. Add the following to your configuration file:

   ```json
   {
     "mcpServers": {
       "vercel-ai-docs": {
         "command": "node",
         "args": ["ABSOLUTE_PATH_TO_PROJECT/dist/main.js"],
         "env": {
           "GOOGLE_GENERATIVE_AI_API_KEY": "your-google-api-key-here"
         }
       }
     }
   }
   ```
For more information about using MCP with Cursor, refer to the Cursor MCP documentation.
Usage
The MCP server exposes three primary tools:
1. agent-query
Query the Vercel AI SDK documentation using an AI agent that can search and synthesize information.
```json
{
  "name": "agent-query",
  "arguments": {
    "query": "How do I use the streamText function?",
    "sessionId": "unique-session-id"
  }
}
```
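On the wire, an MCP client wraps a call like this in a JSON-RPC 2.0 `tools/call` request. A sketch of building that envelope follows; the `buildToolCall` helper is hypothetical and for illustration only, since real clients such as Claude Desktop construct this via the MCP SDK:

```typescript
// Build the JSON-RPC 2.0 envelope an MCP client sends for a tool call.
// (Hypothetical helper for illustration, not part of this repository.)

interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  name: string,
  args: Record<string, unknown>,
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// The agent-query example above as a full request envelope:
const request = buildToolCall(1, "agent-query", {
  query: "How do I use the streamText function?",
  sessionId: "unique-session-id",
});
```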
2. direct-query
Perform a direct similarity search against the Vercel AI SDK documentation index.
```json
{
  "name": "direct-query",
  "arguments": {
    "query": "streamText usage",
    "limit": 5
  }
}
```
3. clear-memory
Clears the conversation memory for a specific session or all sessions.
```json
{
  "name": "clear-memory",
  "arguments": {
    "sessionId": "unique-session-id"
  }
}
```
To clear all sessions, omit the sessionId parameter.
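The session semantics above can be sketched with a minimal in-memory store. The `SessionMemory` class below is illustrative only, not the project's actual AgentService internals (the real server persists session data under `files/sessions/`):

```typescript
// Minimal sketch of per-session conversation memory with the same
// clear-one / clear-all semantics as the clear-memory tool.

type Message = { role: "user" | "assistant"; content: string };

class SessionMemory {
  private sessions = new Map<string, Message[]>();

  append(sessionId: string, message: Message): void {
    const history = this.sessions.get(sessionId) ?? [];
    history.push(message);
    this.sessions.set(sessionId, history);
  }

  history(sessionId: string): Message[] {
    return this.sessions.get(sessionId) ?? [];
  }

  // Omitting sessionId clears every session, matching the tool's behavior
  // when the sessionId parameter is left out.
  clear(sessionId?: string): void {
    if (sessionId === undefined) {
      this.sessions.clear();
    } else {
      this.sessions.delete(sessionId);
    }
  }
}
```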
Development
Project Structure
```
├── config/          # Configuration settings
├── core/            # Core functionality
│   ├── indexing/    # Document indexing and vector store
│   └── query/       # Query services (agent and direct)
├── files/           # Storage directories
│   ├── docs/        # Processed documentation
│   ├── faiss_index/ # Vector index files
│   └── sessions/    # Session data
├── mcp/             # MCP server and tools
│   ├── server.ts    # MCP server implementation
│   └── tools/       # MCP tool definitions
├── scripts/         # Build and utility scripts
└── utils/           # Helper utilities
```
Build Scripts
- `npm run build`: Compile TypeScript files
- `npm run build:index`: Build the documentation index
- `npm run dev:index`: Build and index in development mode
- `npm run dev`: Build and start in development mode
Troubleshooting
Common Issues
- **Index not found or failed to load**: Run `npm run build:index` to create the index before starting the server.
- **API rate limits**: When exceeding Google API rate limits, the agent service may return errors. Implement an appropriate backoff strategy.
- **Model connection issues**: Ensure your Google API key is valid and has access to the specified Gemini model.
- **Claude Desktop not showing MCP server**:
  - Check your configuration file for syntax errors.
  - Make sure the path to the server is correct and absolute.
  - Check Claude Desktop logs for errors.
  - Restart Claude Desktop after making configuration changes.
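The backoff strategy mentioned under "API rate limits" could look like the sketch below. The base delay, cap, and attempt count are arbitrary illustrative choices, and production code would typically add random jitter:

```typescript
// Deterministic exponential backoff schedule for retrying rate-limited
// API calls: delay(n) = min(baseMs * 2^n, maxMs), attempts numbered from 0.

function backoffDelayMs(attempt: number, baseMs = 500, maxMs = 8000): number {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Retry an async operation, sleeping backoffDelayMs between failures.
async function withRetries<T>(
  op: () => Promise<T>,
  maxAttempts = 5,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await op();
    } catch (err) {
      lastError = err;
      await new Promise((resolve) => setTimeout(resolve, backoffDelayMs(attempt)));
    }
  }
  throw lastError;
}
```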
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT
Dev Tools Supporting MCP
The following are the main code editors that support the Model Context Protocol. Follow the links to their official websites for more information.