MCP Knowledgebase LLM
What is MCP Knowledgebase LLM?
mcp-knowledgebase-llm is a lightweight knowledge base assistant that integrates MCP with LLM capabilities, designed to facilitate the creation of simple AI-powered knowledge assistants through a streamlined server-client architecture.
Use cases
Use cases include answering frequently asked questions, providing information on specific topics, and assisting users in navigating complex systems.
How to use
To use mcp-knowledgebase-llm, install the required dependencies with Poetry, set up your OpenAI API key in a .env file, and run the server and client scripts. The client can operate in two modes: direct tool calls or LLM-powered interactions.
Key features
Key features include a streamlined server-client architecture, integration with OpenAI for LLM capabilities, customizable tools, and a knowledge base accessible via SSE transport.
Where to use
mcp-knowledgebase-llm can be used in various fields such as customer support, educational tools, and any application that requires an AI-driven knowledge assistant.
Content
MCP Knowledge Base
A simple MCP client-server example.
Requirements
- Python 3.9 or higher
- Poetry for dependency management
- OpenAI API key
Setup
1. Install dependencies using Poetry:
poetry install
2. Create a .env file in the project root or parent directory with your OpenAI API key:
OPENAI_API_KEY=your_api_key_here
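Both server and client read OPENAI_API_KEY from the environment. The project presumably uses a dotenv library for this; as a rough sketch of what loading a .env file involves, here is a simplified stand-in parser (not the project's actual loading code):

```python
import os
import tempfile

def load_dotenv_minimal(path: str) -> None:
    """Parse KEY=VALUE lines from a .env file into os.environ (simplified stand-in)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()  # overwrite for demo simplicity

# Demo with a throwaway .env file
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("OPENAI_API_KEY=your_api_key_here\n")
    env_path = f.name

load_dotenv_minimal(env_path)
print(os.environ["OPENAI_API_KEY"])  # your_api_key_here
```

A real dotenv library additionally handles quoting, export prefixes, and variable expansion, which this sketch omits.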
Project Structure
- server.py: MCP server implementation with tools
- client-sse.py: MCP client implementation with LLM capabilities
- data/kb.json: Knowledge base data with MCP-related Q&A
- pyproject.toml: Poetry configuration file
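The exact schema of data/kb.json is not shown here; a plausible shape for a Q&A knowledge base, together with a simple lookup over it, might look like this (the field names and sample entries are assumptions for illustration only):

```python
import json
import tempfile
from typing import Optional

# Hypothetical kb.json shape: a list of question/answer entries.
sample_kb = [
    {"question": "What is MCP?", "answer": "The Model Context Protocol."},
    {"question": "What transports exist?", "answer": "stdio and SSE."},
]

with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample_kb, f)
    kb_path = f.name

def lookup(path: str, query: str) -> Optional[str]:
    """Return the answer whose question contains the query (case-insensitive)."""
    with open(path) as fh:
        entries = json.load(fh)
    for entry in entries:
        if query.lower() in entry["question"].lower():
            return entry["answer"]
    return None

print(lookup(kb_path, "transports"))  # stdio and SSE.
```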
Running the Application
1. Start the server:
poetry run python server.py
2. In a separate terminal, run the client:
poetry run python client-sse.py
Using the Client
The client has two modes:
1. Direct tool calls:
- Uncomment the asyncio.run(test_direct_tool_calls()) line in client-sse.py
- This directly calls the tools without using an LLM
2. LLM-powered interactions (default):
- Uses OpenAI to interpret queries and call appropriate tools
- Ask questions like "What is MCP?" or "What is the difference between stdio and SSE transports?"
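In the LLM-powered mode, the model's response names a tool plus JSON-encoded arguments, and the client dispatches to the matching tool. A minimal sketch of that dispatch step (the tool name, argument shape, and handler below are assumptions, not the project's actual API):

```python
import json
from typing import Any, Callable, Dict

# Hypothetical local handler standing in for one of the server's MCP tools.
def answer_question(query: str) -> str:
    return f"Looked up: {query}"

TOOLS: Dict[str, Callable[..., Any]] = {"answer_question": answer_question}

def dispatch(tool_call: Dict[str, str]) -> Any:
    """Route an LLM tool call (name + JSON arguments) to a registered handler."""
    handler = TOOLS[tool_call["name"]]
    kwargs = json.loads(tool_call["arguments"])
    return handler(**kwargs)

# This dict loosely mirrors the shape of OpenAI function/tool-call payloads.
result = dispatch({"name": "answer_question",
                   "arguments": '{"query": "What is MCP?"}'})
print(result)  # Looked up: What is MCP?
```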
Customizing
- Add new tools to server.py by creating additional functions with the @mcp.tool() decorator
- Modify the knowledge base by updating data/kb.json
- Change the OpenAI model by modifying the model parameter in the MCPClient class