MCP Explorer

Demo Local Streamable Http Mcp With Langgraph Bedrock

@mpontingawson · a year ago · MIT
AI Systems
A local demo of an MCP server with streamable HTTP transport, built on the LangGraph example. It integrates Amazon Bedrock instead of OpenAI.

Overview

What is Demo Local Streamable Http Mcp With Langgraph Bedrock

demo-local-streamable-http-mcp-with-langgraph-bedrock is a local demonstration of an MCP server that utilizes streamable HTTP transport, built upon the LangGraph example. It integrates Amazon Bedrock instead of OpenAI, showcasing a stateless server architecture.

Use cases

Use cases include real-time data processing applications, microservices architecture, and scenarios requiring high scalability and low latency in request handling.

How to use

To use the demo, start by launching the MCP server with the command uv run mcp-simple-streamablehttp-stateless. Ensure you are in the correct directory and that the server is running on a local address, typically http://0.0.0.0:3000. You can then test the server with a Python client.

Key features

Key features include stateless operation with StreamableHTTP transport, ephemeral connections for each request, no session state maintained, and task lifecycle scoped to individual requests. It is designed for deployment in multi-node environments.

Where to use

This MCP server can be used in various fields such as cloud computing, data processing, and application development where stateless interactions are beneficial, particularly in multi-node setups.

Content

Demo Local Streamable HTTP MCP with LangGraph and Bedrock LLM

Adapted from the Python SDK for MCP and the LangChain MCP Adapters.

Steps to run locally

  1. Start the MCP Server
    • Validate the MCP Server is running with MCP Inspector
  2. Run the Python client to test the MCP server

MCP Server

Adapted from the LangGraph MCP Simple StreamableHttp Stateless Server Example

A stateless MCP server example demonstrating the StreamableHttp transport without maintaining session state. This example is ideal for understanding how to deploy MCP servers in multi-node environments where requests can be routed to any instance. In the example in this repo, only a single MCP server node is being used.

Features

  • Uses the StreamableHTTP transport in stateless mode (mcp_session_id=None)
  • Each request creates a new ephemeral connection
  • No session state maintained between requests
  • Task lifecycle scoped to individual requests
  • Suitable for deployment in multi-node environments

The server exposes two tools:

  • add
  • multiply
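As a hedged sketch (not the repo's actual server.py, which is adapted from the SDK's lowlevel example), two such tools could be exposed with the MCP Python SDK's FastMCP helper in stateless streamable-HTTP mode; the names FastMCP, stateless_http, and the "streamable-http" transport string are assumed from the SDK:

```python
# Sketch only: requires the MCP Python SDK (`pip install mcp`).

def add(a: int, b: int) -> int:
    """Return the sum of two integers."""
    return a + b

def multiply(a: int, b: int) -> int:
    """Return the product of two integers."""
    return a * b

if __name__ == "__main__":
    # Imported here so the plain tool functions above work without the SDK installed.
    from mcp.server.fastmcp import FastMCP

    server = FastMCP("demo", stateless_http=True)  # no session state kept between requests
    server.tool()(add)        # register the functions as MCP tools
    server.tool()(multiply)
    server.run(transport="streamable-http")  # serves requests on the /mcp path
```

Because the server is stateless, any copy of this process behind a load balancer can answer any request, which is what makes the multi-node deployment story straightforward.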

Usage

  1. In the terminal, make sure you are in the top-level folder for the sample. Do not navigate into the folder containing the server.py file.

  2. Start the server

    # Using default port 3000
    uv run mcp-simple-streamablehttp-stateless
    
    # [Optional] Using custom port
    uv run mcp-simple-streamablehttp-stateless --port 3000
    
    # [Optional] Custom logging level
    uv run mcp-simple-streamablehttp-stateless --log-level DEBUG
    
    # [Optional] Enable JSON responses instead of SSE streams
    uv run mcp-simple-streamablehttp-stateless --json-response
    
  3. If it is running correctly, you should see that the server is now running at a local address like http://0.0.0.0:3000. The MCP server will be accessible on the /mcp path.

Leave the terminal with the running server open so you can use it with the inspector and the client.
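Under the hood, each stateless request to /mcp is a self-contained JSON-RPC 2.0 POST. As an illustrative sketch (not code from this repo), this is roughly the payload shape for calling the add tool via the MCP tools/call method; the argument values are arbitrary:

```python
import json

# Shape of an MCP tools/call request body (JSON-RPC 2.0); values are illustrative.
payload = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "add",
        "arguments": {"a": 3, "b": 5},
    },
}

body = json.dumps(payload)
print(body)
```

Real clients also send the appropriate Content-Type and Accept headers per the streamable HTTP transport spec; the SDK client shown later handles all of this for you.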

MCP Inspector

https://modelcontextprotocol.io/docs/tools/inspector

  1. Run the inspector

        # This starts the inspector on a different port than the server
        npx @modelcontextprotocol/inspector
    
  2. For Transport Type, select Streamable HTTP

  3. For URL, add http://0.0.0.0:3000/mcp or whatever localhost + port combo you’re using

Client

A streamable HTTP client written in Python that serves as a base file to test the HTTP transport method. It is bare-bones, but it showcases how to use the Python SDK's mcp.client.streamable_http module.

This local client invokes a LangGraph agent, built with create_react_agent, that uses an Amazon Bedrock model.
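A hedged sketch of what such a client can look like, assuming the mcp, langchain-mcp-adapters, langchain-aws, and langgraph packages; the Bedrock model ID is a placeholder, so check the repo's client.py for the one it actually uses:

```python
def mcp_url(host: str, port: int) -> str:
    """Build the server endpoint; the MCP server listens on the /mcp path."""
    return f"http://{host}:{port}/mcp"

if __name__ == "__main__":
    # Imports kept here because they require the non-stdlib packages named above.
    import asyncio
    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client
    from langchain_mcp_adapters.tools import load_mcp_tools
    from langchain_aws import ChatBedrockConverse
    from langgraph.prebuilt import create_react_agent

    async def main() -> None:
        async with streamablehttp_client(mcp_url("0.0.0.0", 3000)) as (read, write, _):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # Expose the server's MCP tools (add, multiply) as LangChain tools
                tools = await load_mcp_tools(session)
                # Placeholder model ID; substitute one your AWS account can invoke
                model = ChatBedrockConverse(
                    model="anthropic.claude-3-5-sonnet-20240620-v1:0"
                )
                agent = create_react_agent(model, tools)
                result = await agent.ainvoke({"messages": "what is (3 + 5) x 12?"})
                print(result["messages"][-1].content)

    asyncio.run(main())
```

Because the transport is stateless, the session here lives only for the duration of this one script run; nothing about it is remembered server-side.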

Usage

You can either use the client.py or the client-readable.py file to test the MCP server.

  • Option 1: client.py = what you’ll see in most examples; it has a print statement at the end that shows the full response from invoking the agent.
  • Option 2: client-readable.py = the same functionality as client.py, with some additional code to make the agent’s output readable in the terminal.

Prerequisites

  1. Your AWS CLI credentials must allow access to the Amazon Bedrock model called in the client.

    • Run aws configure in the CLI to verify that you have the correct access set up, or that you have assumed a profile with the needed ~/.aws/credentials (macOS).
  2. Run the following command to test the server.

        # Option 1
        python client.py
    
        # Option 2
        python client-readable.py
    

To test different responses from the agent, change the message passed to agent.ainvoke() in the file.
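For instance (a hypothetical edit, since the exact variable names in client.py may differ), swapping the dict passed to ainvoke changes the question the agent answers:

```python
# Hypothetical sketch: the input dict follows create_react_agent's convention.
prompt = {"messages": "what is 7 multiplied by 9, plus 4?"}
# result = await agent.ainvoke(prompt)   # inside the client's async code
print(prompt["messages"])
```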
