
MCP Chat Example

@integration-app · 25 days ago · MIT license
Example chat application that implements an MCP Client to expose an LLM to tools powered by Integration App

Overview

What is MCP Chat Example

MCP-chat-example is a chat application that uses the Model Context Protocol (MCP) to connect users with AI models such as Anthropic’s Claude. It provides a user-friendly web interface for interaction.

Use cases

Use cases include automated customer service chatbots, interactive learning assistants, virtual personal assistants, and tools for developers to test and interact with AI models.

How to use

To use MCP-chat-example, clone the repository, install the necessary dependencies, configure your environment variables including the MCP server URL and API keys, and run the application to start chatting with the AI model.

Key features

Key features include a user-friendly chat interface, integration with MCP for accessing various tools, processing of user input, dynamic tool selection, command execution, and response generation using a powerful language model.

Where to use

MCP-chat-example can be used in various fields such as customer support, educational tools, personal assistants, and any application requiring natural language interaction with AI.

Content

AI Chat Agent Example

This is a template for an application showcasing integration capabilities using Integration.app. The app is built with Next.js and demonstrates how to implement user authentication and token generation, as well as an AI-powered chat agent that can use tools from an MCP server.

Prerequisites

  • Node.js 18+ installed
  • Integration.app workspace credentials (Workspace Key and Secret)
  • LLM Credentials (Key and Model/Provider Name)
  • MCP Server access (See here to run MCP Server)

Setup

  1. Clone the repository:
git clone https://github.com/integration-app/MCP-chat-example
cd MCP-chat-example
  2. Install dependencies:
npm install
# or
yarn install
  3. Set up environment variables:
# Copy the sample environment file
cp .env-sample .env
  4. Edit .env and add your Integration.app credentials and LLM API keys:
INTEGRATION_APP_WORKSPACE_KEY=YOUR_KEY_HERE
INTEGRATION_APP_WORKSPACE_SECRET=YOUR_SECRET_HERE
OPENAI_API_KEY=YOUR_KEY_HERE
ANTHROPIC_API_KEY=YOUR_KEY_HERE
LLM_PROVIDER=openai # or anthropic
LLM_MODEL=gpt-4o # or your preferred model name

You can find these credentials in your Integration.app workspace settings.
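The LLM_PROVIDER and LLM_MODEL variables decide which chat model the app instantiates. A minimal sketch of how such a selection could be resolved from the environment (the resolveLLMConfig helper, its defaults, and its error messages are illustrative, not code from the template):

```typescript
// Illustrative helper: resolve the LLM configuration from the environment
// variables shown in the .env sample above. This is a sketch of the idea,
// not the template's actual implementation.
interface LLMConfig {
  provider: "openai" | "anthropic";
  model: string;
  apiKey: string;
}

function resolveLLMConfig(env: Record<string, string | undefined>): LLMConfig {
  const provider = env.LLM_PROVIDER ?? "openai";
  if (provider !== "openai" && provider !== "anthropic") {
    throw new Error(`Unsupported LLM_PROVIDER: ${provider}`);
  }
  // Pick the API key matching the chosen provider
  const apiKey =
    provider === "openai" ? env.OPENAI_API_KEY : env.ANTHROPIC_API_KEY;
  if (!apiKey) {
    throw new Error(`Missing API key for provider: ${provider}`);
  }
  return {
    provider,
    // Fall back to an assumed per-provider default when LLM_MODEL is unset
    model:
      env.LLM_MODEL ??
      (provider === "openai" ? "gpt-4o" : "claude-3-5-sonnet-latest"),
    apiKey,
  };
}
```

Centralizing this lookup in one helper keeps the rest of the app agnostic about which provider is active.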

Running the Application

  1. Start the development server:
npm run dev
# or
yarn dev
  2. Open http://localhost:3000 in your browser.

Project Structure

  • /src/app - Next.js app router pages and API routes
    • /integrations - Integration connection UI
    • /chat - AI chat agent page (uses MCP tools)
    • /tools - List of available MCP tools
    • /api - Backend API routes for integration token and tool listing
  • /src/components - Reusable React components
  • /src/lib - Utility functions and helpers (including MCP agent logic)
  • /public - Static assets

Authentication

The template implements a simple authentication mechanism using a randomly generated UUID as the customer ID. This simulates a real-world scenario where your application authenticates users so the agent can access tools in their own personal accounts on external apps. The customer ID is used to:

  • Identify the user/customer in the Integration.app workspace
  • Generate integration app tokens for external app connections
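Integration tokens of this kind are typically JWTs signed with the workspace secret. A minimal self-contained sketch of the idea, built on Node's crypto module — the exact claim names (iss, id) and the HS256 algorithm are assumptions here; consult Integration.app's token documentation for the real requirements:

```typescript
import { createHmac, randomUUID } from "crypto";

// Base64url-encode a string or buffer (JWT segments use this encoding)
function base64url(input: Buffer | string): string {
  return Buffer.from(input)
    .toString("base64")
    .replace(/=/g, "")
    .replace(/\+/g, "-")
    .replace(/\//g, "_");
}

// Hypothetical token generator: signs a JWT with the workspace secret.
// Claim names and the HS256 algorithm are assumptions for illustration;
// the template uses Integration.app's own tooling for this.
function generateIntegrationToken(
  workspaceKey: string,
  workspaceSecret: string,
  customerId: string
): string {
  const header = base64url(JSON.stringify({ alg: "HS256", typ: "JWT" }));
  const payload = base64url(
    JSON.stringify({
      iss: workspaceKey, // identifies your workspace (assumed claim name)
      id: customerId, // the randomly generated customer ID (assumed claim name)
      exp: Math.floor(Date.now() / 1000) + 3600, // expire in one hour
    })
  );
  const signature = base64url(
    createHmac("sha256", workspaceSecret)
      .update(`${header}.${payload}`)
      .digest()
  );
  return `${header}.${payload}.${signature}`;
}

// Each session gets a fresh UUID as its customer ID
const customerId = randomUUID();
const token = generateIntegrationToken("DEMO_KEY", "DEMO_SECRET", customerId);
```

Because the customer ID is random per session, connections made through the token are scoped to that session's simulated user.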

AI Chat Agent

The chat agent uses the latest LangChain and LangGraph libraries to connect to an MCP server and invoke tools from connected integrations. You can control the LLM provider and model via environment variables.
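Under the hood, an MCP client lists and invokes tools by exchanging JSON-RPC messages with the server. A minimal sketch of those message shapes — the method names (tools/list, tools/call) follow the Model Context Protocol specification, while the request-builder helpers are illustrative; the template delegates all of this to LangChain's MCP adapter:

```typescript
// JSON-RPC 2.0 request envelope used by MCP
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

let nextId = 0;

// Ask the server which tools it exposes
function listToolsRequest(): JsonRpcRequest {
  return { jsonrpc: "2.0", id: ++nextId, method: "tools/list" };
}

// Invoke one tool by name with a bag of arguments
function callToolRequest(
  name: string,
  args: Record<string, unknown>
): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id: ++nextId,
    method: "tools/call",
    params: { name, arguments: args },
  };
}
```

The agent loop is then: fetch the tool list, let the LLM pick a tool and arguments, send a tools/call request, and feed the result back into the conversation.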

