Atlas Docs MCP Server
Overview
What is Atlas Docs MCP Server
Atlas Docs MCP is a server designed to provide AI assistants with comprehensive technical documentation for various libraries and frameworks. It leverages the Model Context Protocol (MCP) to ensure LLMs have access to the most relevant documentation necessary for accurate code generation and library usage.
Use cases
This server is particularly beneficial for AI models when generating code that utilizes less popular or newly released libraries. It allows models to reference documentation directly, improving the accuracy of code snippets and reducing errors associated with unfamiliar libraries.
How to use
To integrate Atlas Docs MCP into your workflow, configure your MCP client (such as Cursor, Cline, or Windsurf) by adding a server entry to its JSON configuration file. Alternatively, you can install it automatically via Smithery by running a command tailored to your client.
Key features
Key features of Atlas Docs MCP include the ability to provide processed documentation in a clean markdown format, support for various MCP-compatible clients, and a range of available tools such as listing documentation sets, retrieving indices, conducting keyword searches, and fetching specific documentation pages.
Where to use
Atlas Docs MCP can be used in any environment with an MCP-compatible client, across a wide range of libraries and frameworks. It’s particularly useful for AI-assisted development, programming education, and any application where accurate access to documentation is essential for efficient coding and learning.
Content
Atlas Docs MCP Server
A Model Context Protocol (MCP) server that provides AI assistants with documentation for libraries and frameworks.
> [!WARNING]
> Atlas Docs is currently in beta. Not everything might work perfectly, but we’re actively improving the service. Your patience and feedback are greatly appreciated!
What Does This Server Do?
LLMs are great at generating general code, but suck at correctly using less popular or newly released libraries. This isn’t surprising, since the models have not been trained comprehensively on code using these libraries.
Atlas Docs MCP server:
- Provides technical documentation for libraries and frameworks
- Processes the official docs into a clean markdown version for LLM consumption
- Is easy to set up with Cursor, Cline, Windsurf and any other MCP-compatible LLM clients
Demo: Claude 3.5 Sonnet on its own vs. Claude 3.5 Sonnet with Atlas Docs MCP.
📦 Installation
Atlas Docs MCP server works with any MCP client that supports the stdio protocol, including:
- Cursor
- Cline
- Windsurf
- Claude Desktop
Add the following to your MCP client configuration file:
```json
{
  "mcpServers": {
    "atlas-docs": {
      "command": "npx",
      "args": [
        "-y",
        "@cartographai/atlas-docs-mcp"
      ]
    }
  }
}
```
That’s it! You may need to restart the app (for Claude Desktop) for the server to be recognised.
Tip: Prompt your model to check the docs, e.g. “Use the tools to check the documentation for Astro to ensure that you use the library correctly.”
Installing via Smithery
Alternatively, you can install Atlas Docs MCP automatically via Smithery. Example for Claude Desktop:
npx -y @smithery/cli install @CartographAI/atlas-docs-mcp --client claude
Change “claude” to “cursor”, “cline” or “windsurf” for the respective clients.
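For example, to install it for Cursor instead:
npx -y @smithery/cli install @CartographAI/atlas-docs-mcp --client cursor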
📒 Available Libraries
- AI-SDK (source: https://sdk.vercel.ai/docs/introduction)
- Astro (source: https://docs.astro.build/en/getting-started)
- ast-grep (source: https://ast-grep.github.io/llms.txt)
- Bun (source: https://bun.sh/llms.txt)
- CrewAI (source: https://docs.crewai.com/llms.txt)
- Drizzle (source: https://orm.drizzle.team/llms.txt)
- ElevenLabs (source: https://elevenlabs.io/docs/llms.txt)
- Fireworks (source: https://docs.fireworks.ai/llms.txt)
- Hono (source: https://hono.dev/llms.txt)
- Langgraph-js (source: https://langchain-ai.github.io/langgraphjs/llms.txt)
- Langgraph-py (source: https://langchain-ai.github.io/langgraph/llms.txt)
- Mastra (source: https://mastra.ai/llms.txt)
- ModelContextProtocol (source: https://modelcontextprotocol.io/llms.txt)
- Pglite (source: https://pglite.dev/docs/about)
- Prisma (source: https://www.prisma.io/docs/llms.txt)
- Resend (source: https://resend.com/docs/llms.txt)
- shadcn/ui (source: https://ui.shadcn.com/docs)
- Stripe (source: https://docs.stripe.com/llms.txt)
- Svelte (source: https://svelte.dev/docs/svelte/overview)
- SvelteKit (source: https://svelte.dev/docs/kit/introduction)
- tailwindcss (source: https://tailwindcss.com/docs/installation)
- TanStack-Router (source: https://tanstack.com/router/latest/docs/framework/react/overview)
- Trigger.dev (source: https://trigger.dev/docs/llms.txt)
- X (source: https://docs.x.com/llms.txt)
- Zapier (source: https://docs.zapier.com/llms.txt)
Want docs for another library not in this list? Please open an issue in this repo, we’ll try to process and add it!
🔨 Available Tools
- list_docs: List all available documentation sets
- get_docs_index: Retrieves a condensed, LLM-friendly index of a documentation set
- get_docs_full: Retrieves a complete documentation set in a single consolidated file
- search_docs: Search a documentation set by keywords
- get_docs_page: Retrieves a specific page of a documentation set
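These tools are normally invoked by your MCP client on the model’s behalf, but you can also exercise them directly. Below is a minimal sketch using the official TypeScript MCP SDK (@modelcontextprotocol/sdk) to spawn the server over stdio and call a couple of its tools. The tool argument names (docName, query) are assumptions for illustration, not the server’s documented schema; use listTools() to see the real input schemas.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Spawn the Atlas Docs server over stdio, the same way Cursor or Claude Desktop would.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["-y", "@cartographai/atlas-docs-mcp"],
});

const client = new Client(
  { name: "atlas-docs-demo", version: "1.0.0" },
  { capabilities: {} },
);
await client.connect(transport);

// Discover the available tools and their real input schemas.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// List every documentation set the server exposes.
const docSets = await client.callTool({ name: "list_docs", arguments: {} });
console.log(docSets);

// Keyword-search one documentation set.
// NOTE: "docName" and "query" are assumed argument names for illustration only.
const search = await client.callTool({
  name: "search_docs",
  arguments: { docName: "Astro", query: "routing" },
});
console.log(search);

await client.close();
```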
💭 How It Works
Atlas Docs processes tech libraries’ documentation sites into clean, markdown versions. This MCP server provides the docs as MCP tools, calling Atlas Docs APIs for the data.
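To make that architecture concrete, here is a rough sketch of how one such tool could be wired up with the TypeScript MCP SDK: the tool handler simply fetches pre-processed markdown from an HTTP backend and returns it as text content. The endpoint path, argument names, and fallback URL below are invented for illustration and are not the actual Atlas Docs API.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Backend that serves the pre-processed markdown docs (see "Running the backend locally").
const ATLAS_API_URL = process.env.ATLAS_API_URL ?? "https://example-atlas-backend.invalid";

const server = new McpServer({ name: "atlas-docs-sketch", version: "0.0.1" });

// Hypothetical tool: fetch one documentation page as markdown and return it as text.
// The route and argument names are illustrative only.
server.tool(
  "get_docs_page",
  { docName: z.string(), pagePath: z.string() },
  async ({ docName, pagePath }) => {
    const res = await fetch(
      `${ATLAS_API_URL}/docs/${docName}/pages/${encodeURIComponent(pagePath)}`,
    );
    return { content: [{ type: "text", text: await res.text() }] };
  },
);

// Expose the tool over stdio so any MCP client can spawn and call it.
await server.connect(new StdioServerTransport());
```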
Running the backend locally
Please visit CartographAI/atlas and follow the instructions in the README.
Update ATLAS_API_URL with the URL of your deployment.
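For example, assuming the server reads ATLAS_API_URL from its environment, you can point it at a local backend from your MCP client configuration (the localhost port below is just a placeholder):

```json
{
  "mcpServers": {
    "atlas-docs": {
      "command": "npx",
      "args": ["-y", "@cartographai/atlas-docs-mcp"],
      "env": {
        "ATLAS_API_URL": "http://localhost:3000"
      }
    }
  }
}
```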
Support & Feedback
Please open an issue in this repo to request docs for a library, or to report a bug.
If you have any questions, feedback, or just want to say hi, we’d love to hear from you. You can find us on Cartograph’s Discord community for real-time support, or email us at [email protected]