mcp-memory-bank
What is mcp-memory-bank?
mcp-memory-bank is a powerful, production-ready context management system designed for Large Language Models (LLMs). It utilizes ChromaDB and modern embedding technologies to provide persistent, project-specific memory capabilities that enhance AI understanding and response quality.
Use cases
Use cases for mcp-memory-bank include developing intelligent chatbots that remember user interactions, creating personalized AI assistants that adapt to user preferences, and enhancing search functionalities in applications by providing context-aware results.
How to use
To use mcp-memory-bank, ensure you have the prerequisites such as Node.js, npm, Docker Desktop, and sufficient RAM and disk space. You can set it up quickly by cloning the repository, installing dependencies, and running it in development mode using Docker.
Key features
Key features include high-performance vector storage with ChromaDB, project isolation for different contexts, smart search capabilities (both semantic and keyword-based), real-time updates with dynamic content management, precise recall through advanced embedding generation, and easy deployment options with Docker.
Where to use
mcp-memory-bank can be used in various fields such as AI development, natural language processing, chatbots, and any application requiring enhanced context management for LLMs.
🐳 Running with Docker
This project is fully Docker-ready for easy deployment and local development. The provided Dockerfile and docker-compose.yml set up both the main application and its required ChromaDB vector database.
Requirements
- Docker (latest stable)
- Docker Compose (v2+ recommended)
Environment Variables
The following environment variables are used by default (can be overridden in your environment or via docker-compose.yml):
```
CHROMADB_URL=http://chromadb:8000
TRANSPORT=http
HTTP_PORT=3000
MCP_MEMBANK_EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
NODE_ENV=production
NODE_OPTIONS=--max-old-space-size=4096
```
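Since these defaults can be overridden from your environment, one option is to export them in the shell before starting the stack. A minimal sketch, assuming your `docker-compose.yml` interpolates these variables (the values below simply mirror the documented defaults):

```shell
# Override the documented defaults in the current shell; docker-compose
# can interpolate these into the service configuration if the compose
# file references them (e.g. ${CHROMADB_URL}).
export CHROMADB_URL=http://chromadb:8000
export HTTP_PORT=3000
export MCP_MEMBANK_EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2

# Sanity-check the values before running `docker-compose up`.
echo "ChromaDB URL: $CHROMADB_URL"
echo "HTTP port:    $HTTP_PORT"
```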
Build & Run
To build and start all services:
```
docker-compose up --build -d
```
This will:
- Build the main TypeScript application (Node.js v22.13.1-slim)
- Start the app as `ts-app` (listening on port 3000)
- Start ChromaDB as `chromadb` (listening on port 8000)
- Create a persistent volume for ChromaDB data
- Set up a shared Docker network for inter-service communication
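Put together, the provided `docker-compose.yml` likely resembles the sketch below. This is a reconstruction from the bullets above, not the project's actual file: the ChromaDB image tag and its internal data path are assumptions.

```yaml
# Hypothetical reconstruction of the described docker-compose.yml.
services:
  ts-app:
    build: .
    ports:
      - "3000:3000"
    environment:
      - CHROMADB_URL=http://chromadb:8000
      - TRANSPORT=http
      - HTTP_PORT=3000
    depends_on:
      - chromadb
  chromadb:
    image: chromadb/chroma        # assumed image name
    ports:
      - "8000:8000"
    volumes:
      - chromadb-data:/chroma/chroma   # assumed data path inside the image
volumes:
  chromadb-data:
```

Both services share the default Compose network, which is what lets `ts-app` reach the database at the hostname `chromadb`.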
Ports
- 3000: Main application HTTP API (`ts-app`)
- 8000: ChromaDB vector database (`chromadb`)
Data Persistence
- ChromaDB data is persisted in the named Docker volume `chromadb-data`.
- The application data directory (`/app/data`) is created and owned by a non-root user inside the container.
Special Notes
- The application requires ChromaDB to be available at the URL specified by `CHROMADB_URL` (default: `http://chromadb:8000`).
- The embedding model can be changed via the `MCP_MEMBANK_EMBEDDING_MODEL` environment variable.
- If you need to customize environment variables, edit `docker-compose.yml` or use an `.env` file.