langchain-mcp
What is langchain-mcp?
langchain-mcp is a LangChain agent that utilizes Model Context Protocol (MCP) servers for tool integration, allowing interaction with various services like web search, weather information retrieval, and mathematical expression evaluation.
Use cases
Use cases for langchain-mcp include building chatbots that can answer queries using web search, applications that provide weather forecasts, and tools that perform mathematical calculations based on user input.
How to use
To use langchain-mcp, clone the repository, create a virtual environment, install the dependencies, configure the necessary API keys in a .env file, and run the agent from the command line with `python src/agent.py`.
Key features
Key features include graceful shutdown with proper signal handling, subprocess management for tracking MCP server processes, robust error handling, and a modular design that allows easy extension with additional MCP servers.
Where to use
langchain-mcp can be used in various fields such as web development, data analysis, and any application requiring integration with external services for search, weather data, or mathematical computations.
LangChain Agent with MCP Servers
A LangChain agent using MCP Adapters for tool integration with Model Context Protocol (MCP) servers.
Overview
This project demonstrates how to build a LangChain agent that uses the Model Context Protocol (MCP) to interact with various services:
- Tavily Search: Web search and news search capabilities
- Weather: Mock weather information retrieval
- Math: Mathematical expression evaluation
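As an illustration of the math service, an expression-evaluation tool can walk the AST of an untrusted input instead of calling eval(). This is a minimal standard-library sketch, not the project's actual implementation:

```python
import ast
import operator

# Operators the evaluator accepts; anything else is rejected.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def evaluate(expression: str) -> float:
    """Safely evaluate a basic arithmetic expression from a user query."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported expression: {expression!r}")
    return walk(ast.parse(expression, mode="eval").body)

print(evaluate("(3 + 5) * 12"))  # 96
```

Restricting the walker to a whitelist of node types is what keeps arbitrary code (attribute access, function calls) out of the evaluator.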
The agent uses LangGraph’s ReAct agent pattern to dynamically select and use these tools based on user queries.
Features
- Graceful Shutdown: All MCP servers implement proper signal handling for clean termination
- Subprocess Management: The main agent tracks and manages all MCP server subprocesses
- Error Handling: Robust error handling throughout the application
- Modular Design: Easy to extend with additional MCP servers
Graceful Shutdown Mechanism
This project implements a comprehensive graceful shutdown system:
- Signal Handling: Captures SIGINT and SIGTERM signals to initiate graceful shutdown
- Process Tracking: The main agent maintains a registry of all child processes
- Cleanup Process: Ensures all subprocesses are properly terminated on exit
- Shutdown Flags: Each MCP server has a shutdown flag to prevent new operations when shutdown is initiated
- Async Cooperation: Uses asyncio to allow operations in progress to complete when possible
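A rough sketch of the pattern these bullets describe follows; the class and method names are invented for illustration and do not come from the project's source:

```python
import signal
import subprocess

class ServerManager:
    """Tracks MCP server subprocesses and coordinates graceful shutdown."""

    def __init__(self) -> None:
        self.processes: list[subprocess.Popen] = []
        self.shutdown_requested = False

    def register(self, proc: subprocess.Popen) -> None:
        # Keep a registry of child processes so they can be cleaned up on exit.
        self.processes.append(proc)

    def request_shutdown(self, signum=None, frame=None) -> None:
        # Signal handler for SIGINT/SIGTERM: flip the flag so no new
        # operations start, then let in-flight work drain.
        self.shutdown_requested = True

    def cleanup(self) -> None:
        # Terminate every tracked subprocess, escalating to kill if needed.
        for proc in self.processes:
            proc.terminate()
            try:
                proc.wait(timeout=5)
            except subprocess.TimeoutExpired:
                proc.kill()

manager = ServerManager()
signal.signal(signal.SIGINT, manager.request_shutdown)
signal.signal(signal.SIGTERM, manager.request_shutdown)
```

Checking `manager.shutdown_requested` before starting new work, and calling `manager.cleanup()` on exit, gives the cooperative shutdown the bullets above describe.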
Installation
# Clone the repository
git clone https://github.com/yourusername/langchain-mcp.git
cd langchain-mcp
# Create a virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -e .
Configuration
Create a .env file in the project root with the following variables:
OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key
Usage
Run the agent from the command line:
python src/agent.py
The agent will prompt for your query and then process it using the appropriate tools.
Development
To add a new MCP server:
- Create a new file in src/mcpserver/
- Implement the server with proper signal handling
- Update src/mcpserver/__init__.py to expose the new server
- Add the server configuration to src/agent.py
License
MIT