
MCP Client Chatbot

@cgoinglove · 19 days ago
253 · MIT
Free · Community
AI Systems
#chatbot #mcp #ollama #nextjs
🚀 Multi-provider AI chatbot client powered by MCP

Overview

What is MCP Client Chatbot

MCP Client Chatbot is a versatile chat interface that allows users to interact with multiple AI providers such as OpenAI, Anthropic, Google, and Ollama through the Model Context Protocol (MCP).

Use cases

Use cases include automating customer inquiries, generating content based on user prompts, providing personalized recommendations, and facilitating educational interactions through AI-driven conversations.

How to use

To use the MCP Client Chatbot, install the necessary dependencies using pnpm, initialize the project to set up the environment, and start the development server. Access the application at http://localhost:3000.
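
In short, assuming pnpm and Docker are available (the full sequence, including the Docker Compose variant, is in Getting Started below):

# install dependencies (this also generates the .env file for your API keys)
pnpm i

# set up the database, run migrations, and start the development server
pnpm docker:pg && pnpm db:migrate && pnpm dev
# then open http://localhost:3000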

Key features

Key features include easy integration with various AI providers, a user-friendly chat interface, support for file-based MCP management, and the ability to run locally without complex setup.

Where to use

MCP Client Chatbot can be used in various fields such as customer support, personal assistance, content generation, and educational tools, leveraging AI capabilities for enhanced user interaction.

Content

MCP Client Chatbot


Our goal is to create the best possible chatbot UX — focusing on the joy and intuitiveness users feel when calling and interacting with AI tools.

See the experience in action in the preview below!

Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. It leverages the power of the Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.

Preview

Get a feel for the UX — here’s a quick look at what’s possible.

🧩 Browser Automation with Playwright MCP


Example: Control a web browser using Microsoft’s playwright-mcp tool.

  • The LLM autonomously decides how to use tools from the MCP server, calling them multiple times to complete a multi-step task and return a final message.

Sample prompt:

Please go to GitHub and visit the cgoinglove/mcp-client-chatbot project.
Then, click on the README.md file.
After that, close the browser.
Finally, tell me how to install the package.
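
To reproduce this demo, the Playwright MCP server has to be registered first, either from the UI or via file-based MCP config. A minimal sketch of a server entry, assuming the usual command/args schema and Microsoft's @playwright/mcp npm package (the exact file name and format are described in the MCP Server Setup guide below):

{
  "playwright": {
    "command": "npx",
    "args": ["@playwright/mcp@latest"]
  }
}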

🎙️ Realtime Voice Assistant + MCP Tools

This demo showcases a realtime voice-based chatbot assistant built with OpenAI’s new Realtime API — now extended with full MCP tool integration.
Talk to the assistant naturally, and watch it execute tools in real time.

⚡️ Quick Tool Mentions (@) & Presets


Quickly call any registered MCP tool during chat by typing @toolname.
No need to memorize — just type @ and select from the list!

You can also create tool presets by selecting only the MCP servers or tools you want.
Switch between presets instantly with a click — perfect for organizing tools by task or workflow.

🧭 Tool Choice Mode


Control how tools are used in each chat with Tool Choice Mode — switch anytime with ⌘P.

  • Auto: The model automatically calls tools when needed.
  • Manual: The model will ask for your permission before calling a tool.
  • None: Tool usage is disabled completely.

This lets you flexibly choose between autonomous, guided, or tool-free interaction depending on the situation.


…and there’s even more waiting for you.
Try it out and see what else it can do!


Getting Started

This project uses pnpm as the recommended package manager.

# If you don't have pnpm:
npm install -g pnpm

Quick Start (Docker Compose Version) 🐳

# 1. Install dependencies
pnpm i

# 2. Enter only the LLM PROVIDER API key(s) you want to use in the .env file at the project root.
# Example: The app works with just OPENAI_API_KEY filled in.
# (The .env file is automatically created when you run pnpm i.)

# 3. Build and start all services (including PostgreSQL) with Docker Compose
pnpm docker-compose:up

Quick Start (Local Version) 🚀

# 1. Install dependencies
pnpm i

# 2. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. (Optional) If you already have PostgreSQL running and .env is configured, skip this step
pnpm docker:pg

# 4. Run database migrations
pnpm db:migrate

# 5. Start the development server
pnpm dev

# 6. (Optional) Build & start for local production-like testing
pnpm build:local && pnpm start
# Use build:local for local start to ensure correct cookie settings

Open http://localhost:3000 in your browser to get started.


Environment Variables

The pnpm i command generates a .env file. Add your API keys there.

# === LLM Provider API Keys ===
# You only need to enter the keys for the providers you plan to use
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
XAI_API_KEY=****
ANTHROPIC_API_KEY=****
OPENROUTER_API_KEY=****
OLLAMA_BASE_URL=http://localhost:11434/api


# Secret for Better Auth (generate with: npx @better-auth/cli@latest secret)
BETTER_AUTH_SECRET=****

# (Optional)
# URL for Better Auth (the URL you access the app from)
BETTER_AUTH_URL=

# === Database ===
# If you don't have PostgreSQL running locally, start it with: pnpm docker:pg
POSTGRES_URL=postgres://your_username:your_password@localhost:5432/your_database_name

# Whether to use file-based MCP config (default: false)
FILE_BASED_MCP_CONFIG=false

# (Optional)
# === OAuth Settings ===
# Fill in these values only if you want to enable Google/GitHub login
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
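
If you plan to use the Ollama provider, the default OLLAMA_BASE_URL above points at a local Ollama server. A minimal sketch for getting one running (the model name is only an example; any model you have pulled works):

# start the Ollama server (serves its API on http://localhost:11434 by default)
ollama serve

# download a model to chat with
ollama pull llama3.2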

📘 Guides

Step-by-step setup guides for running and configuring MCP Client Chatbot.

🔌 MCP Server Setup & Tool Testing

  • How to add and configure MCP servers in your environment

🐳 Docker Hosting Guide

  • How to self-host the chatbot using Docker, including environment configuration.

▲ Vercel Hosting Guide

  • Deploy the chatbot to Vercel with simple setup steps for production use.

🎯 System Prompts & Chat Customization

  • Personalize your chatbot experience with custom system prompts, user preferences, and MCP tool instructions

🔐 OAuth Sign-In Setup

  • Configure Google and GitHub OAuth for secure user login support.

Adding OpenAI-Compatible Providers

  • How to add additional OpenAI-compatible AI providers.

💡 Tips

Advanced use cases and extra capabilities that enhance your chatbot experience.

🧠 Agentic Chatbot with Project Instructions

  • Use MCP servers and structured project instructions to build a custom assistant that helps with specific tasks.

💬 Temporary Chat Windows

  • Open lightweight popup chats for quick side questions or testing — separate from your main thread.

🗺️ Roadmap

Planned features coming soon to MCP Client Chatbot:

  • [ ] MCP-integrated LLM Workflow
  • [ ] File Attach & Image Generation
  • [ ] Collaborative Document Editing (like OpenAI Canvas: user & assistant co-editing)
  • [ ] RAG (Retrieval-Augmented Generation)
  • [ ] Web-based Compute (with WebContainers integration)

💡 If you have suggestions or need specific features, please create an issue!

🙌 Contributing

We welcome all contributions! Bug reports, feature ideas, code improvements — everything helps us build the best local AI assistant.

For detailed contribution guidelines, please see our Contributing Guide.

Language Translations: Help us make the chatbot accessible to more users by adding new language translations. See language.md for instructions on how to contribute translations.

Let’s build it together 🚀

💬 Join Our Discord


Connect with the community, ask questions, and get support on our official Discord server!
