LangGraph RAG MCP
What is langgraph-rag-mcp?
langgraph-rag-mcp is a Retrieval-Augmented Generation (RAG) system designed to serve LangGraph documentation through the Model Context Protocol (MCP). It collects, processes, and retrieves documentation to provide context-aware responses.
Use cases
Use cases include providing instant documentation support in software development environments, enhancing chatbot responses with relevant documentation, and facilitating educational platforms that require quick access to reference materials.
How to use
To use langgraph-rag-mcp, clone the repository, set up a Python virtual environment, install the required packages, and run the MCP server to expose the retrieval function for use with compatible hosts.
Key features
Key features include documentation collection and processing, a semantic vector database for efficient retrieval, integration with language models for context-aware responses, and MCP server integration for easy access.
Where to use
langgraph-rag-mcp can be used in various fields such as software documentation, customer support, educational tools, and any application requiring enhanced information retrieval and contextual understanding.
LangGraph RAG MCP
A Retrieval-Augmented Generation (RAG) system that serves LangGraph documentation through the Model Context Protocol (MCP).
Overview
This project builds a documentation retrieval system that:
- Collects and processes LangGraph documentation from the official website
- Creates a vector database from this documentation for semantic search
- Exposes this knowledge through the Model Context Protocol (MCP)
- Integrates with MCP-compatible hosts like VS Code, Cursor, Claude Desktop, or Windsurf
How It Works
1. Documentation Collection and Processing (Context)
- Recursively scrapes and cleans LangGraph documentation from multiple website URLs using `RecursiveUrlLoader` and BeautifulSoup.
- Splits text into manageable chunks using `RecursiveCharacterTextSplitter` with `tiktoken` for accurate token counting.
- Embeds chunks into vector representations using `BAAI/bge-large-en-v1.5` embeddings.
- Stores vectors in an `SKLearnVectorStore` for efficient retrieval.
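The chunking step above can be sketched in plain Python. This is a stdlib-only stand-in: the project itself uses `RecursiveCharacterTextSplitter` with `tiktoken`-based token counting, whereas this sketch counts words, and the function name is illustrative.

```python
# Stdlib-only sketch of overlapping chunking (the real pipeline counts
# tokens via tiktoken; here chunk_size and overlap are in words).
def split_into_chunks(text, chunk_size=50, overlap=10):
    """Split text into overlapping word-count chunks."""
    words = text.split()
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(words), step):
        chunk = " ".join(words[start:start + chunk_size])
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(words):
            break
    return chunks
```

Overlap between neighboring chunks keeps sentences that straddle a boundary retrievable from at least one chunk.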
2. Retrieval System (Tool)
- Implements a retrieval function that finds the most relevant documentation chunks for a given query.
- Integrates this function with language models like Claude to provide context-aware responses.
- Returns formatted responses that include source attribution and relevant context.
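The retrieval idea can be illustrated with a toy example: rank pre-embedded chunks by cosine similarity to a query vector and format the top hits with their sources. The vectors and index layout here are stand-ins for the real `BAAI/bge-large-en-v1.5` embeddings and `SKLearnVectorStore`.

```python
# Toy sketch of similarity-based retrieval with source attribution.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, k=2):
    """index: list of (source, chunk_text, vector).
    Returns the top-k chunks formatted with their sources."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[2]),
                    reverse=True)
    return "\n\n".join(f"[{src}]\n{text}" for src, text, _ in ranked[:k])
```

In the real system the formatted context is handed to the language model alongside the user's question, which is what makes the responses context-aware.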
3. MCP Server Integration
- Wraps the retrieval tool in an MCP server using the `fastmcp` library.
- Exposes the retrieval function as a tool that MCP-compatible hosts can use.
- Provides access to both the retrieval system and additional resources (like the full documentation file).
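The tool-exposure pattern can be sketched without the `fastmcp` dependency. This stand-in mimics the shape of registering a named tool and dispatching a call to it; the tool name and the echoed body are hypothetical, and the real server does the vector-store lookup inside the tool function.

```python
# Stand-in for fastmcp's tool registration: a decorator records the
# function under a tool name, and handle_call dispatches to it the way
# an MCP host's tool call would.
TOOLS = {}

def tool(name):
    """Register a function under a tool name (mimics @mcp.tool())."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("langgraph_query_tool")  # hypothetical tool name
def langgraph_query_tool(query: str) -> str:
    # The real implementation queries the vector store here.
    return f"docs context for: {query}"

def handle_call(name, **kwargs):
    """Dispatch an MCP-style tool call to the registered function."""
    return TOOLS[name](**kwargs)
```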
Requirements
- Python 3.10+
- Docker and Docker Compose (recommended)
- Anthropic API key (for Claude models)
Installation and Setup
You can run this project either with Docker (recommended) or in a local Python environment.
Using Docker (Recommended)
1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/langraph-rag-mcp.git
   cd langraph-rag-mcp
   ```

2. Set up your API keys in a `.env` file. Create a `.env` file in the project root and add your Anthropic API key:

   ```bash
   echo "ANTHROPIC_API_KEY=your_api_key_here" > .env
   ```

   The `docker-compose.yml` file will automatically load this environment variable.
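For reference, a compose file that picks up the key from `.env` might look like the fragment below. The service name and options are illustrative, not the project's actual `docker-compose.yml`; `env_file` tells Compose to inject the variables from `.env` into the container.

```yaml
# Hypothetical docker-compose.yml fragment (names are illustrative).
services:
  mcp-server:
    build: .
    env_file:
      - .env            # injects ANTHROPIC_API_KEY into the container
    stdin_open: true    # MCP communicates over standard I/O
```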
Local Environment (without Docker)
1. Clone this repository:

   ```bash
   git clone https://github.com/yourusername/langraph-rag-mcp.git
   cd langraph-rag-mcp
   ```

2. Create and activate a virtual environment:

   ```bash
   conda create -n mcp python=3.13
   conda activate mcp
   ```

3. Install the required packages:

   ```bash
   pip install -r requirements.txt
   ```

4. Set up your API keys in a `.env` file:

   ```bash
   echo "ANTHROPIC_API_KEY=your_api_key_here" > .env
   ```
Usage
The process involves two main steps: first, generating the vector store, and second, running the MCP server.
Step 1: Generate the Vector Store
You only need to do this once, or whenever you want to update the documentation.
- Open and run the `rag-tool.ipynb` notebook in a Jupyter environment.
- This will:
  - Download the latest LangGraph documentation.
  - Save the full documentation to `llms_full.txt`.
  - Split the documents into chunks.
  - Create and persist a vector store at `sklearn_vectorstore.parquet`.
Step 2: Run the MCP Server
With Docker
The easiest way to run the server is using the provided shell script, which wraps Docker Compose.
```bash
bash run-mcp-docker.sh
```
This script will build the Docker image if it doesn’t exist, start the container, and then execute the MCP server inside it, correctly handling standard I/O for MCP communication.
Without Docker
If you are not using Docker, you can run the MCP server directly in dev mode:

```bash
mcp dev langgraph-mcp.py
```
Configuring MCP Hosts
To use this MCP server with a compatible editor, you need to configure it.
VS Code
1. Open your VS Code `settings.json` file. (You can find it via the command palette: `Preferences: Open User Settings (JSON)`.)
2. Add the following configuration to the file. Make sure to replace `<path-to-your-project>` with the absolute path to the `langraph-rag-mcp` directory on your machine.

If you are not using Docker, change the command so it runs the server directly (for example, `mcp dev langgraph-mcp.py` instead of the Docker script).
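One plausible shape for that configuration is sketched below. The exact key names depend on your VS Code MCP client and version, and the server name is illustrative; treat this as a starting point rather than the project's actual snippet.

```json
{
  "mcp": {
    "servers": {
      "langgraph-rag": {
        "command": "bash",
        "args": ["<path-to-your-project>/run-mcp-docker.sh"]
      }
    }
  }
}
```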
System Architecture
```
(Phase 1: Data Ingestion - Performed once on Host Machine)

┌─────────────┐     ┌──────────────────┐     ┌──────────────────────────┐
│ LangGraph   │     │ Jupyter Notebook │     │ Vector Store & Full Docs │
│ Docs (Web)  │──▶  │ (rag-tool.ipynb) │──▶  │ (.parquet & .txt files)  │
└─────────────┘     └──────────────────┘     └──────────────────────────┘

(Phase 2: Live RAG System - Request/Response Flow)

1. VS Code (user interface) sends the user query; the .env file
   provides the API key.
2. run-mcp-docker.sh (entrypoint script) execs into the Docker
   container running the Python MCP server (langgraph-mcp.py).
3. The MCP server reads data from the mounted vector store and full
   docs (files mounted from the host machine).
4. The server makes an API call to the Anthropic API (Claude LLM),
   sending the augmented prompt.
5. Claude generates a response.
6. The response returns to the MCP server.
7. The MCP server sends the final response back to VS Code.
```