mcp-intro
What is mcp-intro?
mcp-intro is a tutorial project that demonstrates how to integrate Model Context Protocol (MCP) servers with Langgraph agents to create AI applications, featuring a data science assistant named Scout.
Use cases
Use cases include managing data science projects, real-time data querying, and enhancing user interaction with AI through conversational interfaces.
How to use
To use mcp-intro, clone the repository, set up a virtual environment, install dependencies, and configure environment variables with your OpenAI API key and other necessary tokens.
Key features
Key features include the use of GPT-4.1 as the base model, integration with multiple MCP servers, orchestration of conversation flow using Langgraph, and a streaming interface for real-time responses.
Where to use
mcp-intro can be used in data science projects, AI application development, and any domain requiring conversational AI capabilities and tool integration.
MCP-Langgraph Integration Tutorial
This tutorial demonstrates how to integrate Model Context Protocol (MCP) servers with Langgraph agents to create powerful, tool-enabled AI applications. The project showcases a data science assistant named Scout that can help users manage their data science projects using various MCP-powered tools.
Overview
The project implements a conversational AI agent that:
- Uses GPT-4.1 as the base model
- Integrates with multiple MCP servers for different functionalities
- Uses Langgraph for orchestrating the conversation flow
- Provides a streaming interface for real-time responses
Prerequisites
- Python 3.13+
- Node.js (for filesystem MCP server)
- Docker (for GitHub MCP server)
- UV package manager
- OpenAI API key
Project Structure
scout/
├── graph.py          # Langgraph agent implementation
├── client.py         # MCP client and streaming interface
├── client_utils.py   # Utility functions
├── main.py           # Entry point
└── my_mcp/           # MCP server configurations
    ├── config.py         # Config loading and env var resolution
    ├── mcp_config.json   # MCP server definitions
    └── local_servers/    # Custom MCP server implementations
Setup
- Clone the repository:
git clone <repository-url>
cd mcp-intro
- Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
- Install dependencies:
uv pip install -e .
- Set up environment variables:
Create a .env file with:
OPENAI_API_KEY=your_openai_api_key
MCP_FILESYSTEM_DIR=/path/to/projects/directory
MCP_GITHUB_PAT=your_github_personal_access_token
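The project structure lists my_mcp/config.py as handling "config loading and env var resolution". A plausible way to do that is to substitute ${VAR} placeholders in the JSON config with values from the environment; the sketch below is an illustration of that idea, not the repository's actual code, and the placeholder syntax and function names are assumptions:

```python
import json
import os
import re

# Hypothetical sketch of env-var resolution of the kind my_mcp/config.py
# is described as performing; the ${VAR} placeholder syntax is an assumption.
_PLACEHOLDER = re.compile(r"\$\{(\w+)\}")

def resolve_env(value):
    """Recursively replace ${VAR} placeholders with environment values.

    Unset variables are left untouched so the error surfaces visibly later.
    """
    if isinstance(value, str):
        return _PLACEHOLDER.sub(
            lambda m: os.environ.get(m.group(1), m.group(0)), value
        )
    if isinstance(value, dict):
        return {k: resolve_env(v) for k, v in value.items()}
    if isinstance(value, list):
        return [resolve_env(v) for v in value]
    return value

def load_config(path):
    """Load a JSON config file and resolve all ${VAR} placeholders in it."""
    with open(path) as f:
        return resolve_env(json.load(f))
```

With MCP_FILESYSTEM_DIR set, resolve_env({"args": ["${MCP_FILESYSTEM_DIR}"]}) would yield the concrete path in place of the placeholder.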
MCP Servers
This project integrates with four MCP servers:
- Dataflow Server: Custom implementation for data loading and querying
- Filesystem Server: Uses @modelcontextprotocol/server-filesystem for file operations
- Git Server: Uses mcp-server-git for local git operations
- GitHub Server: Uses the official GitHub MCP server for GitHub operations
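To make the server list concrete, here is one plausible shape for mcp_config.json wiring up the filesystem, git, and GitHub servers with the environment variables from Setup. The exact schema and launch commands are assumptions, not copied from the repository:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "${MCP_FILESYSTEM_DIR}"]
    },
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git"]
    },
    "github": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN",
               "ghcr.io/github/github-mcp-server"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "${MCP_GITHUB_PAT}" }
    }
  }
}
```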
Usage
- Start the application:
python -m scout.client
- Interact with Scout by typing your questions or requests. For example:
USER: Can you help me set up a new data science project?
- Scout will use its tools to:
- Create and manage project directories
- Handle data loading and transformation
- Manage version control
- Interact with GitHub repositories
- Type ‘quit’ or ‘exit’ to end the session.
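The session loop described above, including the quit/exit handling, can be sketched as follows. This is an illustrative, dependency-free outline; the real client.py streams agent responses, and all names here are assumptions:

```python
# Illustrative sketch of the interactive session loop; in the real client
# respond() would stream the Langgraph agent's answer. Names are assumptions.
EXIT_COMMANDS = {"quit", "exit"}

def is_exit(user_input: str) -> bool:
    """Return True when the user asked to end the session."""
    return user_input.strip().lower() in EXIT_COMMANDS

def chat_loop(ask, respond):
    """Read user turns via ask() until an exit command; pass turns to respond()."""
    while True:
        text = ask()
        if is_exit(text):
            break
        respond(text)
```

Injecting ask/respond as callables keeps the loop testable without a terminal attached.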
How It Works
- The graph.py file defines the Langgraph agent structure:
  - Sets up the system prompt and agent state
  - Configures the LLM (GPT-4.1)
  - Defines the conversation flow graph
- The client.py file:
  - Initializes the MCP client with multiple servers
  - Handles streaming responses
  - Manages the interactive session
- MCP servers provide tools for:
- File system operations
- Data manipulation
- Git operations
- GitHub interactions
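Conceptually, the conversation flow alternates between a model node and a tools node until the model stops requesting tool calls. The dependency-free simulation below illustrates that routing; it is not the actual langgraph API (the project uses Langgraph's graph primitives for this):

```python
# Dependency-free simulation of the agent/tool routing that Langgraph
# orchestrates in graph.py. The real project builds this as a state graph.
def run_agent(model, tools, state):
    """Alternate model and tool calls until the model returns a final answer.

    model(state) returns either {"type": "tool", "tool": name, "args": ...}
    or {"type": "final", "content": ...}; tools maps names to callables.
    """
    while True:
        action = model(state)                # model node: decide next step
        if action["type"] == "final":        # no tool call -> end of graph
            return action["content"]
        result = tools[action["tool"]](action["args"])   # tools node
        state.append({"tool": action["tool"], "result": result})
```

The loop mirrors the conditional edge in a typical Langgraph agent: route to the tools node while the model emits tool calls, otherwise terminate.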
Extending the Project
You can extend this project by:
- Adding new MCP servers in my_mcp/local_servers/
- Modifying the system prompt in graph.py
- Adding new tools to the agent
- Customizing the conversation flow
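A new tool for a local server can start as a single well-typed function; in an actual server under my_mcp/local_servers/ it would then be registered with the MCP Python SDK (e.g. a FastMCP tool decorator), which is omitted here to keep the sketch dependency-free. The function below is a hypothetical dataflow-style example, not one of the project's real tools:

```python
import csv
import io

# Hypothetical example of the kind of tool a server in my_mcp/local_servers/
# could expose. Registration with the MCP SDK is intentionally omitted.
def describe_csv(text: str) -> dict:
    """Return the row count and column names for CSV content.

    Assumes the first line is a header row.
    """
    rows = list(csv.reader(io.StringIO(text)))
    if not rows:
        return {"rows": 0, "columns": []}
    return {"rows": len(rows) - 1, "columns": rows[0]}
```

Keeping the tool a plain function makes it unit-testable independently of any MCP transport.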
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.