llm-mcp-rag-js
What is llm-mcp-rag-js
llm-mcp-rag-js is a TypeScript-based framework that enhances Retrieval-Augmented Generation (RAG) capabilities, functioning like an intelligent librarian that retrieves and augments knowledge while maintaining strong type safety.
Use cases
Use cases include building chatbots that provide contextually relevant answers, enhancing search engines with intelligent document retrieval, and creating applications that require dynamic knowledge augmentation.
How to use
To use llm-mcp-rag-js, install the dependencies with `pnpm install`, configure the environment variables in a .env file, and run the service in development mode with `pnpm dev`. You can then initialize the MCP service, load documents, and create a RAG agent to process queries.
Key features
Key features include modular architecture, document retrieval capabilities, integration with MCP protocols, error handling, and support for OpenAI API. It also offers a retry mechanism and context-augmented response generation.
Where to use
llm-mcp-rag-js can be utilized in various fields such as natural language processing, AI-driven applications, knowledge management systems, and any domain requiring intelligent document retrieval and response generation.
LLM-MCP-RAG-JS
Tip: If you need a Python implementation, see LLM-MCP-RAG-Python.
The Python version is more extensible and integrates more easily with RAG pipelines, vector databases, and ComfyUI.
Project Architecture
The project uses a modular design built around the following core components.
Directory Structure
```
src/
├── core/       # Core business logic
├── protocol/   # MCP protocol implementation
├── retrieval/  # Document retrieval
├── utils/      # Utility functions
└── config/     # Configuration files
```
Core Components
- MCP Service (src/protocol/MCPService.ts)
  - Handles communication with MCP servers
  - Manages tool registration and invocation
  - Implements retry and error handling
- Document Retrieval (src/retrieval/)
  - Implements document retrieval
  - Supports document loading and vectorization
  - Provides similarity search
- RAG Agent (src/core/)
  - Coordinates the MCP service and document retrieval
  - Processes user queries
  - Generates context-augmented responses
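The similarity search in the retrieval component can be illustrated with a minimal cosine-similarity sketch. The names `Doc` and `rankBySimilarity` are illustrative only, not the project's actual API:

```typescript
// Minimal cosine-similarity search sketch.
// `Doc` and `rankBySimilarity` are illustrative names, not the project's API.
interface Doc {
  id: string;
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the topK documents most similar to the query embedding.
function rankBySimilarity(query: number[], docs: Doc[], topK = 3): Doc[] {
  return [...docs]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) -
        cosineSimilarity(query, x.embedding),
    )
    .slice(0, topK);
}
```

A real implementation would obtain the embeddings from the configured embedding API rather than construct them by hand.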
Tech Stack
- Runtime: Node.js
- Language: TypeScript
- Package manager: pnpm
- Main dependencies:
  - @modelcontextprotocol/sdk: MCP protocol implementation
  - openai: OpenAI API client
  - zod: schema validation
  - dotenv: environment variable management
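To illustrate the kind of environment validation the dotenv/zod combination enables, here is a dependency-free sketch; `loadEnv` and the exact checks are assumptions for illustration, not the project's actual config module:

```typescript
// Dependency-free sketch of env validation. The real project uses
// dotenv + zod; `loadEnv` here is an illustrative stand-in.
const REQUIRED_VARS = [
  "EMBEDDING_BASE_URL",
  "EMBEDDING_KEY",
  "OPENAI_BASE_URL",
  "OPENAI_API_KEY",
] as const;

type Env = Record<(typeof REQUIRED_VARS)[number], string>;

// Check that every required variable is present and non-empty,
// throwing a readable error listing anything missing.
function loadEnv(raw: Record<string, string | undefined>): Env {
  const missing = REQUIRED_VARS.filter((k) => !raw[k]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return Object.fromEntries(
    REQUIRED_VARS.map((k) => [k, raw[k]!]),
  ) as Env;
}
```

With zod the same idea becomes a `z.object({...}).parse(process.env)` call, which also gives typed access to the values.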
Configuration and Running
Requirements
- Node.js >= 18
- pnpm >= 10.6.3
Install Dependencies
```bash
pnpm install
```
Environment Variables
Create a .env file and configure the following variables:
```
EMBEDDING_BASE_URL=https://oneapi.biubiuniu.com
EMBEDDING_KEY=your_embedding_api_key
OPENAI_BASE_URL=https://oneapi.biubiuniu.com
OPENAI_API_KEY=your_llm_api_key
```
Development
```bash
# Development mode
pnpm dev

# Build
pnpm build

# Production
pnpm start
```
Planned Improvements
- Error handling
  - Finer-grained error types
  - Configurable retry strategies
  - Better error logging
- Performance
  - Document caching
  - Faster vector retrieval
  - Batch processing support
- Observability
  - Detailed logging
  - Performance metrics collection
  - Monitoring endpoints
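The retry mechanism described under the core components, together with the configurable retry strategy listed as future work, could take a shape like the following; `withRetry` is a hypothetical helper, not the project's actual implementation:

```typescript
// Hypothetical retry helper with exponential backoff.
// `withRetry` and `RetryOptions` are illustrative; the project's
// actual retry logic in MCPService may differ.
interface RetryOptions {
  maxAttempts: number;
  baseDelayMs: number;
}

async function withRetry<T>(
  fn: () => Promise<T>,
  { maxAttempts, baseDelayMs }: RetryOptions,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff between attempts: baseDelayMs * 2^attempt.
      if (attempt < maxAttempts - 1) {
        await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
      }
    }
  }
  throw lastError;
}
```

Making `RetryOptions` part of the service configuration is one way to realize the "configurable retry strategy" item above.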
Usage Example
```typescript
import { MCPService } from './protocol/MCPService';
import { DocumentRetriever } from './retrieval/DocumentRetriever';
import { RAGAgent } from './core/RAGAgent';

// Initialize the MCP service
const mcpService = new MCPService('my-service', 'mcp-server');
await mcpService.initialize();

// Initialize document retrieval
const retriever = new DocumentRetriever();
await retriever.loadDocuments('./knowledge');

// Create the RAG agent
const agent = new RAGAgent(mcpService, retriever);

// Process a query
const response = await agent.process('your question');
```
Parts of this project's code are based on KelvinQiu802/llm-mcp-rag, with thanks to the author.