Memory Server
What is Memory Server
Memory Server is a semantic memory storage and retrieval system designed to work with the Model Context Protocol (MCP). It utilizes Quarkus for performance, Kotlin for type safety, and Qdrant for efficient vector-based semantic search to provide high-speed memory management.
Use cases
Memory Server can be utilized in various AI applications where semantic understanding of information is necessary. It is particularly useful for agents that require memory capabilities for tasks such as question answering, recommendations, and contextual assistance, facilitating effective information retrieval based on meaning rather than keywords.
How to use
To begin using Memory Server, run it locally with Docker Compose. For memory operations, use the command-line interface (CLI) to store ('remember') and fetch ('recall') memories based on semantic content. Integration with AI agents like Goose is achieved through the memory-proxy extension, which handles communication via MCP calls.
Key features
Key features include semantic vector memory for nuanced recall, tag-based filtering for organizing memories, MCP compatibility for integrating with AI agents, and both a CLI for human operators and an API for agent interaction, supporting flexible and scalable memory operations.
Where to use
Memory Server is suitable for deployments in AI-driven environments that demand advanced semantic memory capabilities. It can be applied in chatbots, virtual assistants, customer support systems, and any application involving natural language processing where context awareness and memory are essential for enhanced user interactions.
Qdrant × Quarkus MCP Memory Server
High-performance semantic memory orchestration for agents and daemons.
Overview
Memory Server provides a powerful semantic memory storage and retrieval system tailored specifically for use with the Model Context Protocol (MCP). It leverages the speed of native Quarkus applications, Kotlin for type-safety, and Qdrant for scalable vector-based semantic search.
Features
- Semantic Vector Memory: Store and recall memories based on semantic similarity rather than exact matches.
- Tag-based Filtering: Easily organize and filter memories using flexible tagging.
- Dual CLI and Agent Interface: Designed for both human operators (memory-cli) and AI agents (memory-proxy).
- MCP Compatible: Uses standard MCP calls for seamless integration with AI agents like Goose.
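To make the first two features concrete, here is a minimal, self-contained sketch of semantic recall with tag filtering. This is not the server's actual implementation (which embeds text and queries Qdrant); the three-dimensional "embeddings" and the `recall` signature are illustrative stand-ins.

```python
from math import sqrt

# Toy in-memory store: each memory carries content, tags, and a hand-made
# 3-dimensional vector standing in for a real embedding.
memories = [
    {"content": "Deploy steps for staging", "tags": {"env": "dev"},  "vec": [0.9, 0.1, 0.0]},
    {"content": "Customer refund policy",   "tags": {"env": "prod"}, "vec": [0.0, 0.2, 0.9]},
    {"content": "CI pipeline overview",     "tags": {"env": "dev"},  "vec": [0.8, 0.3, 0.1]},
]

def cosine(a, b):
    # Cosine similarity: the standard ranking metric for vector search.
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def recall(query_vec, tags=None, top_k=2):
    # Filter by tags first, then rank the survivors by similarity —
    # recall is driven by meaning, not exact keyword matches.
    candidates = [m for m in memories
                  if tags is None or all(m["tags"].get(k) == v for k, v in tags.items())]
    candidates.sort(key=lambda m: cosine(query_vec, m["vec"]), reverse=True)
    return [m["content"] for m in candidates[:top_k]]

print(recall([1.0, 0.0, 0.0], tags={"env": "dev"}))
# → ['Deploy steps for staging', 'CI pipeline overview']
```

The tag filter runs before ranking, so an `env:dev` query never surfaces a `prod` memory no matter how similar its vector is.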
Quickstart
Local Development
To run Memory Server locally using Docker Compose:
docker-compose up
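The compose file would be expected to start the server alongside a Qdrant instance. A minimal sketch of what such a file could look like — the service names, image, build context, the `QDRANT_HOST` variable, and the memory-server port are assumptions, not taken from the project (Qdrant's 6333/6334 REST/gRPC ports are its defaults):

```yaml
services:
  qdrant:
    image: qdrant/qdrant:latest
    ports:
      - "6333:6333"   # REST
      - "6334:6334"   # gRPC (the architecture below shows memory-server using gRPC)
  memory-server:
    build: .          # assumed: built from the repository's own Dockerfile
    depends_on:
      - qdrant
    environment:
      QDRANT_HOST: qdrant   # hypothetical variable name
    ports:
      - "8080:8080"         # assumed REST port
```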
CLI Usage
To use the command-line client for memory operations:
./memory-cli remember --content "This is a test memory" --tags "qa:true,env:dev"
./memory-cli recall --content "test memory"
Agent (Goose) Integration
Load the memory-server extension via Goose CLI with STDIO proxy:
/extension memory-proxy
Example MCP calls via Goose:
{
"request": {
"memory": {
"content": "Semantic search test"
},
"page": 1,
"pageSize": 10
}
}
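The `page` and `pageSize` fields in the request above suggest windowed recall results. A small sketch of how those two parameters would map onto a result list, assuming 1-indexed pages (implied by `"page": 1`); the exact server semantics are not documented here:

```python
def paginate(items, page, page_size):
    # Assuming 1-indexed pages, as "page": 1 in the request suggests.
    start = (page - 1) * page_size
    return items[start:start + page_size]

results = [f"memory-{i}" for i in range(1, 26)]  # 25 hypothetical matches
print(paginate(results, page=1, page_size=10))   # first full window
print(paginate(results, page=3, page_size=10))   # final, partial window
```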
Architecture
memory-cli ↔ memory-proxy ↔ REST (memory-server) ↔ gRPC ↔ Qdrant
- memory-server: REST endpoint, semantic embedding generation, gRPC client to Qdrant.
- memory-cli: User/operator CLI.
- memory-proxy: STDIO proxy enabling MCP tool calls from AI agents like Goose.
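The proxy's role in that chain — bridging an agent's STDIO transport to the server's REST API — can be sketched as a read-forward-respond loop. This is not memory-proxy's actual code; the forwarding step is stubbed where a real proxy would POST to a memory-server endpoint (the path would be hypothetical either way):

```python
import json
import sys

def forward_to_server(request):
    # Stub standing in for an HTTP call to the memory-server REST API.
    # A real proxy would serialize `request` into that call and return
    # the server's JSON response.
    return {"status": "ok", "echo": request}

def serve(stdin=sys.stdin, stdout=sys.stdout):
    # STDIO transport: one JSON request per line in, one JSON response
    # per line out — the shape an MCP host like Goose can drive.
    for line in stdin:
        line = line.strip()
        if not line:
            continue
        request = json.loads(line)
        response = forward_to_server(request)
        stdout.write(json.dumps(response) + "\n")
        stdout.flush()

if __name__ == "__main__":
    serve()
```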
Recipes & Usage
- QA recipes available in recipes/.
- Example configuration and recipes provided for immediate integration with Goose CLI.
Contributing
Contributions are welcome! Please open an issue or pull request on GitHub.
License
This project is licensed under the Apache 2.0 License – see the LICENSE file for details.