MCP Explorer

langchain-mcp

@esakrissa · 9 months ago
MIT license
Free · Community
AI Systems
LangChain Agent with MCP Servers: Using LangChain MCP Adapters for tool integration.

Overview

What is langchain-mcp

langchain-mcp is a LangChain agent that uses Model Context Protocol (MCP) servers for tool integration, enabling interaction with services such as web search, weather information retrieval, and mathematical expression evaluation.

Use cases

Use cases for langchain-mcp include building chatbots that can answer queries using web search, applications that provide weather forecasts, and tools that perform mathematical calculations based on user input.

How to use

To use langchain-mcp, clone the repository, create a virtual environment, install the dependencies, configure the required API keys in a .env file, and run the agent from the command line with python src/agent.py.

Key features

Key features include graceful shutdown with proper signal handling, subprocess management for tracking MCP server processes, robust error handling, and a modular design that allows easy extension with additional MCP servers.

Where to use

langchain-mcp can be used in various fields such as web development, data analysis, and any application requiring integration with external services for search, weather data, or mathematical computations.

Content

LangChain Agent with MCP Servers

A LangChain agent using MCP Adapters for tool integration with Model Context Protocol (MCP) servers.

Overview

This project demonstrates how to build a LangChain agent that uses the Model Context Protocol (MCP) to interact with various services:

  • Tavily Search: Web search and news search capabilities
  • Weather: Mock weather information retrieval
  • Math: Mathematical expression evaluation
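To make the Math tool concrete, here is a sketch of the kind of safe expression evaluation such a server might perform. This is an illustrative stand-alone implementation, not the project's actual code; the function name and supported operator set are assumptions.

```python
import ast
import operator

# Operators this hypothetical Math tool would accept; anything else is rejected.
_OPS = {
    ast.Add: operator.add, ast.Sub: operator.sub,
    ast.Mult: operator.mul, ast.Div: operator.truediv,
    ast.Pow: operator.pow, ast.USub: operator.neg,
}

def safe_eval(expr: str):
    """Evaluate a pure arithmetic expression without calling eval()."""
    def walk(node):
        if isinstance(node, ast.Expression):
            return walk(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported expression: {expr!r}")
    return walk(ast.parse(expr, mode="eval"))
```

Walking the AST instead of calling eval() means names, attribute access, and function calls are rejected outright, which is the property a tool exposed to an LLM agent needs.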

The agent uses LangGraph’s ReAct agent pattern to dynamically select and use these tools based on user queries.

Features

  • Graceful Shutdown: All MCP servers implement proper signal handling for clean termination
  • Subprocess Management: The main agent tracks and manages all MCP server subprocesses
  • Error Handling: Robust error handling throughout the application
  • Modular Design: Easy to extend with additional MCP servers

Graceful Shutdown Mechanism

This project implements a comprehensive graceful shutdown system:

  1. Signal Handling: Captures SIGINT and SIGTERM signals to initiate graceful shutdown
  2. Process Tracking: The main agent maintains a registry of all child processes
  3. Cleanup Process: Ensures all subprocesses are properly terminated on exit
  4. Shutdown Flags: Each MCP server has a shutdown flag to prevent new operations when shutdown is initiated
  5. Async Cooperation: Uses asyncio to allow operations in progress to complete when possible
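Steps 1, 4, and 5 above can be sketched in a few lines of asyncio. The class and function names are illustrative, not the project's actual identifiers:

```python
import asyncio
import signal

class ShutdownState:
    """Holds the shutdown flag (step 4) and installs signal handlers (step 1)."""

    def __init__(self):
        self.shutdown_requested = False

    def install_handlers(self, loop: asyncio.AbstractEventLoop) -> None:
        # Capture SIGINT/SIGTERM and flip the flag instead of letting the
        # default handler kill the process mid-operation (Unix-only API).
        for sig in (signal.SIGINT, signal.SIGTERM):
            loop.add_signal_handler(sig, self.request_shutdown)

    def request_shutdown(self) -> None:
        self.shutdown_requested = True

async def serve(state: ShutdownState, handled: list) -> None:
    # Step 5: a cooperative loop — each iteration finishes its unit of
    # work, then re-checks the flag before starting another.
    while not state.shutdown_requested:
        handled.append("request")
        await asyncio.sleep(0.001)  # yield control back to the event loop
```

Because the loop only exits between work items, an operation already in progress completes before the server stops, which is the "allow operations in progress to complete" behavior described above.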

Installation

# Clone the repository
git clone https://github.com/yourusername/langchain-mcp.git
cd langchain-mcp

# Create a virtual environment
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate

# Install dependencies
pip install -e .

Configuration

Create a .env file in the project root with the following variables:

OPENAI_API_KEY=your_openai_api_key
TAVILY_API_KEY=your_tavily_api_key

Usage

Run the agent from the command line:

python src/agent.py

The agent will prompt for your query and then process it using the appropriate tools.

Development

To add a new MCP server:

  1. Create a new file in src/mcpserver/
  2. Implement the server with proper signal handling
  3. Update src/mcpserver/__init__.py to expose the new server
  4. Add the server configuration to src/agent.py
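As an illustration of step 4, the configuration handed to the agent might look like the following. The server names, file paths, and the client usage shown in the comments are assumptions about how langchain-mcp-adapters is typically wired up, not the project's actual code:

```python
# Hypothetical stdio server entries; names and paths are illustrative.
server_config = {
    "weather": {
        "command": "python",
        "args": ["src/mcpserver/weather.py"],
        "transport": "stdio",
    },
    "my_new_server": {
        "command": "python",
        "args": ["src/mcpserver/my_new_server.py"],
        "transport": "stdio",
    },
}

# With langchain-mcp-adapters, a dict like this is commonly passed to
# MultiServerMCPClient, whose tools then feed LangGraph's ReAct agent:
#   client = MultiServerMCPClient(server_config)
#   tools = await client.get_tools()
#   agent = create_react_agent(model, tools)
```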

License

MIT
