Like-I-Said Memory MCP Server
What is Like-I-Said-memory-mcp-server?
Like-I-Said-memory-mcp-server is a powerful Model Context Protocol (MCP) server designed to provide persistent memory capabilities for AI assistants such as Claude Desktop, Cursor, and Windsurf, featuring an elegant web dashboard for efficient memory management.
Use cases
Use cases include managing user interactions and preferences in AI assistants, storing context for personalized responses, and facilitating data retrieval and updates for AI-driven applications.
How to use
To use Like-I-Said-memory-mcp-server, install it via the one-click installer, ensuring that the installation path does not contain spaces. Once installed, you can manage memories through the modern web dashboard, allowing for full CRUD operations.
Key features
Key features include persistent memory storage, context-aware storage with rich metadata, multi-client support, a JSON-based storage format, a modern React web dashboard, real-time statistics, advanced search and filtering, tag-based organization, and mobile responsiveness.
Where to use
Like-I-Said-memory-mcp-server can be used in various fields such as AI development, personal assistant applications, and any scenario requiring efficient memory management for AI systems.
Like-I-Said MCP v2
MCP memory server for AI assistants - Remember conversations across sessions
Give your AI assistants persistent memory! Store information, preferences, and context that survives conversation restarts.
✨ Features
- 🧠 Persistent Memory - AI remembers across conversations
- 🚀 One-Command Install - Auto-configures all AI clients
- 🌍 Cross-Platform - Windows, macOS, Linux (including WSL)
- 📊 React Dashboard - Modern web interface with real-time updates
- 🔧 6 Memory Tools - Complete memory management suite
- 📝 Markdown Storage - Enhanced frontmatter with categories and relationships
- 🔍 Advanced Search - Full-text search with filters and tags
- 📈 Analytics - Memory usage statistics and insights
- 🎨 Modern UI - Card-based layout with dark theme
🚀 Quick Install
Step 1: Install MCP Server
npx -p @endlessblink/like-i-said-v2 like-i-said-v2 install
The installer will:
- ✅ Auto-detect your AI clients (Claude Desktop, Cursor, Windsurf)
- ✅ Configure MCP settings automatically
- ✅ Test server functionality
- ✅ Preserve existing MCP servers
Step 2: Start the Web Dashboard (Optional)
# Global installation (recommended)
npm install -g @endlessblink/like-i-said-v2
like-i-said-v2 start
# Or run directly from npx
npx -p @endlessblink/like-i-said-v2 like-i-said-v2 start
Visit http://localhost:3001 for visual memory management with AI insights, statistics, and relationship mapping.
📸 Dashboard Screenshots
Memory Management

Modern card-based memory interface with search, filtering, and project organization
Relationship Visualization

Interactive graph visualization showing connections between memories
Analytics Dashboard

Comprehensive statistics and insights about your memory usage
Enhanced Features

AI-powered memory enhancement, clustering, and advanced organization
🎯 Supported AI Clients
| Client | Status | Platform |
|---|---|---|
| Claude Desktop | ✅ Full Support | Windows, macOS, Linux |
| Cursor | ✅ Full Support | Windows, macOS, Linux |
| Windsurf | ✅ Full Support | Windows, macOS, Linux |
| Claude Code (VS Code) | ✅ Full Support | Windows, macOS, Linux |
| Continue | ✅ Full Support | Windows, macOS, Linux |
| Zed Editor | ✅ Full Support | Windows, macOS, Linux |
🛠️ Available Tools
After installation, your AI assistant will have these tools:
- add_memory - Store information with tags, categories, and project context
- get_memory - Retrieve a specific memory by ID
- list_memories - Show memories with complexity levels and metadata
- delete_memory - Remove a specific memory
- search_memories - Full-text search with project filtering
- test_tool - Verify the MCP connection
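An MCP client invokes these tools with a standard `tools/call` JSON-RPC request. The sketch below shows the general shape of such a call; the argument names (`content`, `tags`, `project`) are illustrative assumptions, not the server's documented schema:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "add_memory",
    "arguments": {
      "content": "Prefers TypeScript over JavaScript for new projects",
      "tags": ["preferences", "typescript"],
      "project": "MyApp"
    }
  }
}
```

In practice your AI client constructs these requests for you; the natural-language prompts in the Usage Examples section are all you normally need.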
Enhanced Memory Features:
- Categories: personal, work, code, research, conversations, preferences
- Complexity Levels: L1 (Simple) → L4 (Advanced)
- Projects: Organize memories by project context
- Relationships: Link related memories together
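To make the storage format concrete, here is a rough sketch of what one memory file might look like. The frontmatter field names below are assumptions inferred from the features listed above (categories, complexity levels, projects, relationships), not the server's exact schema:

```markdown
---
id: mem-001
category: code
complexity: L2
project: MyApp
tags: [react, tailwind]
related: [mem-002]
created: 2024-01-15
---
This React app uses Tailwind CSS and shadcn/ui components.
```

Because memories are plain Markdown with frontmatter, they remain readable and greppable outside the dashboard.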
📋 Usage Examples
Store a preference:
“Remember that I prefer TypeScript over JavaScript for new projects”
Recall information:
“What did I tell you about my TypeScript preference?”
Project context:
“Store that this React app uses Tailwind CSS and shadcn/ui components”
Search memories:
“Find all memories about React projects”
🔧 Advanced Setup
Custom Installation
npx -p @endlessblink/like-i-said-v2 like-i-said-v2 init
Manual Server Start
npx -p @endlessblink/like-i-said-v2 like-i-said-v2 start
🔄 After Installation
1. Restart your AI client:
   - Claude Desktop: Close completely and restart
   - Cursor: Press Ctrl+Shift+P → "Reload Window"
   - Windsurf: Auto-detects changes
2. Test the installation:
   "What MCP tools do you have available?"
3. Start using memory:
   "Remember that I'm working on a Next.js project called MyApp"
🆘 Troubleshooting
Tools don't appear?
- Ensure you fully restarted your AI client
- Wait 2-3 minutes for detection (Claude Desktop may take up to 5 minutes)
- Check your client's logs for MCP connection errors
Windows-specific notes:
- ⚠️ Always use the full npx command format:
  npx -p @endlessblink/like-i-said-v2 like-i-said-v2 install
- The simplified npx @endlessblink/like-i-said-v2 install will NOT work on Windows
- For PowerShell issues, try:
  cmd /c "npx -p @endlessblink/like-i-said-v2 like-i-said-v2 install"
Config locations:
- Claude Desktop:
  - Windows: %APPDATA%\Claude\claude_desktop_config.json
  - macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
  - Linux: ~/.config/Claude/claude_desktop_config.json
- Cursor:
  - Windows: %USERPROFILE%\.cursor\mcp.json
  - macOS/Linux: ~/.cursor/mcp.json
- Windsurf:
  - Windows: %USERPROFILE%\.codeium\windsurf\mcp_config.json
  - macOS/Linux: ~/.codeium/windsurf/mcp_config.json
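The installer writes the MCP entry into these files for you. If you ever need to add it by hand, a Claude Desktop entry would look roughly like the following; the server key name and the `start` argument are assumptions, so verify against what the installer actually produced:

```json
{
  "mcpServers": {
    "like-i-said-memory": {
      "command": "npx",
      "args": ["-p", "@endlessblink/like-i-said-v2", "like-i-said-v2", "start"]
    }
  }
}
```

Existing entries under "mcpServers" should be left in place; the installer preserves them, and manual edits should too.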
Reset installation:
npx -p @endlessblink/like-i-said-v2 like-i-said-v2 install
🔨 Development Setup
If you want to run from source:
# Clone the repository
git clone https://github.com/endlessblink/like-i-said-mcp-server.git
cd like-i-said-mcp-server
# Install dependencies
npm install
# Run development servers
npm run dev:full # Start both API and React dashboard
npm run dev # React dashboard only
npm run dashboard # API server only
# Build for production
npm run build
📊 Memory Storage
- Format: Markdown files with enhanced frontmatter
- Location: memories/ directory, organized by project
- Structure: 145+ memories with complexity levels, categories, and relationships
- Features: Real-time file watching, automatic indexing
- API: RESTful API on port 3001 for dashboard integration
🤝 Contributing
Found a bug or want to contribute?
- Issues: GitHub Issues
- Repository: GitHub
📜 License
MIT License - see LICENSE file for details.
Made for AI enthusiasts who want their assistants to remember! 🧠✨