
mcp-knowledgebase-llm

By @gmogmz on GitHub · 9 months ago · MIT · Free, Community · AI Systems
A lightweight knowledge base assistant using MCP with LLM integration. Features a streamlined server-client architecture combining custom tools with a knowledge base, all accessible via SSE transport. Ideal for building simple AI-powered knowledge assistants.

Overview

What is mcp-knowledgebase-llm?

mcp-knowledgebase-llm is a lightweight knowledge base assistant that integrates MCP with LLM capabilities, designed to facilitate the creation of simple AI-powered knowledge assistants through a streamlined server-client architecture.

Use cases

Use cases include answering frequently asked questions, providing information on specific topics, and assisting users in navigating complex systems.

How to use

To use mcp-knowledgebase-llm, install the required dependencies with Poetry, set up your OpenAI API key in a .env file, and run the server and client scripts. The client can operate in two modes: direct tool calls or LLM-powered interactions.

Key features

Key features include a streamlined server-client architecture, integration with OpenAI for LLM capabilities, customizable tools, and a knowledge base accessible via SSE transport.

Where to use

mcp-knowledgebase-llm can be used in various fields such as customer support, educational tools, and any application that requires an AI-driven knowledge assistant.

Content

MCP Knowledge Base

A simple MCP client-server application with LLM integration.

Requirements

  • Python 3.9 or higher
  • Poetry for dependency management
  • OpenAI API key

Setup

  1. Install dependencies using Poetry:
    poetry install
    

  2. Create a .env file in the project root or parent directory with your OpenAI API key:

OPENAI_API_KEY=your_api_key_here
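The repository does not show how the key is loaded; Python projects with a .env file typically use python-dotenv. A stdlib-only sketch of the same idea (the function name `load_env` is invented for illustration):

```python
import os

def load_env(path=".env"):
    """Minimal .env loader: a stdlib stand-in for python-dotenv.
    Existing environment variables are not overwritten."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines, comments, and lines without KEY=VALUE shape
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())
```

After calling `load_env()`, the key is available as `os.environ["OPENAI_API_KEY"]`.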

Project Structure

  • server.py: MCP server implementation with tools
  • client-sse.py: MCP client implementation with LLM capabilities
  • data/kb.json: Knowledge base data with MCP-related Q&A
  • pyproject.toml: Poetry configuration file
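The knowledge base in data/kb.json holds MCP-related Q&A pairs. The exact schema is not documented here; a plausible loader and exact-match lookup, assuming a list of objects with "question" and "answer" fields, might look like:

```python
import json

def load_kb(path="data/kb.json"):
    """Load the knowledge base file (schema assumed, not documented)."""
    with open(path) as f:
        return json.load(f)

def lookup(kb, question):
    """Return the answer for an exact (case-insensitive) question match,
    or None when the question is not in the knowledge base."""
    for entry in kb:
        if entry["question"].lower() == question.lower():
            return entry["answer"]
    return None
```

In the real project, the LLM mode does not need exact matches: the model reads the knowledge base contents and paraphrases an answer.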

Running the Application

  1. Start the server:

    poetry run python server.py
    
  2. In a separate terminal, run the client:

    poetry run python client-sse.py
    

Using the Client

The client has two modes:

  1. Direct tool calls:

    • Uncomment the asyncio.run(test_direct_tool_calls()) line in client-sse.py
    • This directly calls the tools without using an LLM
  2. LLM-powered interactions (default):

    • Uses OpenAI to interpret queries and call appropriate tools
    • Ask questions like “What is MCP?” or “What is the difference between stdio and SSE transports?”
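In LLM-powered mode, the model (via OpenAI tool calling) selects a tool name and arguments, and the client invokes the matching MCP tool. A hypothetical sketch of that dispatch step, with invented tool names standing in for the tools discovered from the server:

```python
def dispatch(tools, name, arguments):
    """Invoke the tool the LLM selected, or fail loudly if unknown."""
    if name not in tools:
        raise KeyError(f"Unknown tool: {name}")
    return tools[name](**arguments)

# Hypothetical registry standing in for tools listed by the MCP server
tools = {
    "get_knowledge_base": lambda: "...knowledge base contents...",
    "add": lambda a, b: a + b,
}
```

The real client performs this over the SSE connection asynchronously; the sketch only shows the name-to-function mapping at the heart of tool calling.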

Customizing

  • Add new tools to server.py by creating additional functions with the @mcp.tool() decorator
  • Modify the knowledge base by updating data/kb.json
  • Change the OpenAI model by modifying the model parameter in the MCPClient class
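server.py registers tools with the MCP SDK's @mcp.tool() decorator. A toy stand-in illustrating the registration pattern (this is not the SDK itself, and the tool below is invented for demonstration):

```python
class ToyMCP:
    """Toy stand-in for the MCP server object: shows only how a
    decorator can register tools by name. The real SDK also handles
    schemas, transports, and the MCP protocol."""
    def __init__(self):
        self.tools = {}

    def tool(self):
        def register(fn):
            self.tools[fn.__name__] = fn
            return fn
        return register

mcp = ToyMCP()

@mcp.tool()
def answer_question(question: str) -> str:
    """Hypothetical tool: echo the question back."""
    return f"You asked: {question}"
```

With the real SDK, adding a new function decorated this way in server.py is enough for the client's LLM to discover and call it.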

