MemCP
What is MemCP
MemCP is an extensible Memory Context Protocol (MCP) server designed for AI coding agents. It enhances the Zep AI Graphiti MCP server, enabling AI agents to create, maintain, and query a temporal knowledge graph of facts, entities, and relationships.
Use cases
Use cases for MemCP include enhancing coding assistants in IDEs, automating the management of coding-related knowledge, facilitating collaborative coding environments, and supporting AI-driven development tools that require dynamic knowledge updates.
How to use
To use MemCP, clone the repository from GitHub, ensure you have the required prerequisites (Python 3.10+, Neo4j, OpenAI API key), and configure it using environment variables, TOML files, or CLI arguments. It can be integrated with any IDE or LLM client that supports MCP via local SSE or stdio connections.
Key features
Key features of MemCP include a temporal knowledge graph for dynamic information management, MCP integration for compatibility with various clients, automatic entity and relationship extraction, flexible configuration options, persistence through Neo4j storage, and support for multiple transport methods (SSE and stdio).
Where to use
MemCP can be used in software development environments, particularly within Integrated Development Environments (IDEs) and applications that leverage large language models (LLMs) for coding assistance and knowledge management.
MemCP - Memory Context Protocol for AI Agents
MemCP is an extensible memory MCP server for AI coding agents. It modularizes and extends the Zep AI Graphiti mcp-server example, allowing AI agents to build, maintain, and query a temporal knowledge graph of facts, entities, and relationships.
MemCP is currently designed as a plug-and-play memory server for any IDE or LLM client app that supports MCP through local SSE or stdio connections.
Note: It has mostly been tested with the Cursor IDE; other MCP-capable clients have received less coverage.
Features
- Temporal Knowledge Graph: Lets your AI agents automatically build and query a temporal knowledge graph that evolves as new information is added.
- MCP Integration: Works with any MCP-compatible client.
- Entity Extraction: Automatic entity and relationship extraction with coding-specific entity types. Customization is on the roadmap.
- Flexible Configuration: Simple configuration via environment variables, TOML files, or CLI arguments.
- Persistence: Stores knowledge in Neo4j for persistence across sessions.
- Multiple Transports: Supports both SSE (HTTP) and stdio transports for integration with different clients.
Documentation
Comprehensive documentation for MemCP can be found in the docs directory. See the docs README for an overview of available documentation.
Installation
Prerequisites
- Python 3.10 or higher
- Neo4j database (version 5.26 or later)
Note: you can run Neo4j with Docker if you do not want to install it locally.
- OpenAI API key (required for embeddings)
- Anthropic API key (optional, for Claude models)
Clone the repository
git clone https://github.com/evanmschultz/memcp.git
cd memcp
Using UV (Recommended)
UV is a fast package manager for Python written in Rust. Follow their docs for installation instructions if you don’t have it already.
# Install MemCP
uv sync
Using pip
pip install memcp
Optional Dependencies
To use Anthropic models (Claude) instead of OpenAI:
uv sync --extra anthropic
# or
pip install "memcp[anthropic]"
Quick Start
Starting the MemCP Server
- Copy the example environment file and configure your settings:
# Copy the example environment file
cp .env.example .env
# Edit the .env file with your settings
# Replace the following values in .env:
# - NEO4J_PASSWORD
# - OPENAI_API_KEY
# - ANTHROPIC_API_KEY (if using Anthropic)
- Start the MemCP server with default settings:
# Start MemCP server with SSE transport (default)
memcp
The .env file will be automatically loaded when you start the server. Make sure to keep your .env file secure and never commit it to version control.
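A minimal .env, using only the variable names listed above (the values shown are placeholders, not working credentials), might look like:

```
NEO4J_PASSWORD=your-neo4j-password
OPENAI_API_KEY=sk-your-openai-key
# Only needed when using Anthropic models:
# ANTHROPIC_API_KEY=sk-ant-your-key
```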
Note: You can also simply export the environment variables in your shell before starting the server.
Configuration
MemCP can be configured in several ways (in order of precedence):
- Command Line Arguments (highest priority)
- Configuration File (config.toml)
- Default Values (lowest priority)
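The precedence order can be illustrated with a small Python sketch. This is illustrative only; MemCP's actual configuration loader may work differently, and the setting names here are hypothetical:

```python
from collections import ChainMap

# Layered lookup: earlier maps win. This mirrors the precedence
# described above: CLI arguments > config.toml values > defaults.
defaults = {"transport": "sse", "provider": "openai", "graph_id": "default"}
config_file = {"provider": "anthropic"}     # hypothetical config.toml values
cli_args = {"graph_id": "my-memory-graph"}  # hypothetical command-line values

settings = ChainMap(cli_args, config_file, defaults)

print(settings["graph_id"])   # CLI wins: my-memory-graph
print(settings["provider"])   # config.toml overrides the default: anthropic
print(settings["transport"])  # default survives untouched: sse
```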
Note: Use the config.toml file to set configs that you wish to persist between sessions. It can be found in the memcp directory, not the root directory. Custom config.toml paths are on the roadmap; this would allow you to keep multiple configs for different clients, coding sessions, etc.
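As a sketch of what such a file might contain: the section and key names below simply mirror the dotted CLI flags (--graph.id, --server.transport, --llm.provider) and may differ from the actual schema.

```toml
# Hypothetical config.toml sketch; check the shipped config.toml
# for the real key names.
[graph]
id = "my-memory-graph"

[server]
transport = "sse"

[llm]
provider = "openai"
```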
Command Line Arguments
# Using various CLI options
memcp --graph.id=my-memory-graph --server.transport=stdio --llm.provider=anthropic
# Help
memcp --help
Integration with MCP Clients
Cursor IDE Configuration
Add the following to your Cursor plugin configuration:
{
"mcpServers": {
"MemCP": {
"transport": "sse",
"url": "http://localhost:8000/sse"
}
}
}
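For clients that launch the server themselves over stdio instead of connecting to a running SSE endpoint, a hypothetical entry might look like the following. The exact keys depend on the client's MCP configuration schema, and the flag shown is taken from the CLI examples above:

```json
{
  "mcpServers": {
    "MemCP": {
      "command": "memcp",
      "args": ["--server.transport=stdio"]
    }
  }
}
```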
Available Tools
MemCP exposes these MCP tools to your LLM:
- add_episode: Add an episode to the knowledge graph (text, JSON, messages)
- search_nodes: Search for entity nodes in the graph
- search_facts: Search for relationships between entities
- delete_entity_edge: Delete a relationship between entities
- delete_episode: Delete an episode from the knowledge graph
- get_entity_edge: Get details about a specific relationship
- get_episodes: Retrieve recent episodes
- clear_graph: Reset the knowledge graph (use with caution)
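To make the tool semantics concrete, here is a toy in-memory stand-in for a few of the tools above. This is not MemCP's implementation: the real server extracts facts automatically with an LLM and persists them in Neo4j via Graphiti, whereas this sketch only models the call surface.

```python
# Toy stand-in for a few MemCP tools, for illustration only.
class ToyGraph:
    def __init__(self):
        self.episodes = []  # raw inputs, in arrival order
        self.facts = []     # (subject, relation, object) triples

    def add_episode(self, text, facts=()):
        # The real tool extracts facts automatically; here the
        # caller supplies them so the example stays self-contained.
        self.episodes.append(text)
        self.facts.extend(facts)

    def search_facts(self, term):
        return [f for f in self.facts if term in f]

    def get_episodes(self, last_n=5):
        return self.episodes[-last_n:]

    def clear_graph(self):
        self.episodes.clear()
        self.facts.clear()

g = ToyGraph()
g.add_episode(
    "User prefers pytest over unittest",
    facts=[("user", "prefers", "pytest")],
)
print(g.search_facts("pytest"))  # [('user', 'prefers', 'pytest')]
print(g.get_episodes())          # ['User prefers pytest over unittest']
```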
Entity Types
MemCP defines these default entity types:
- Preference: User preferences and likes/dislikes
- Procedure: Steps or actions to perform in certain scenarios
- Requirement: Features or functionalities a product or service must fulfill
You can enable these entity types with the --graph.use_memcp_entities flag.
Note: You can also disable the MemCP entity types and use Graphiti’s default entity types by setting the --graph.use_memcp_entities flag to false.
Docker Deployment
For containerized deployment:
# Build and run with docker-compose
docker compose up
Contributing
Contributions are welcome! Please see our CONTRIBUTING.md for guidelines.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Final Note
Note: MemCP tends to make a large number of API calls, as does Graphiti, which it relies on. This is necessary for accurate and useful graph generation, but it can incur significant cost. Be aware of this when using MemCP.