by Lyellr88
Turn AI into a persistent, memory-powered collaborator. Universal MCP Server (supports HTTP, STDIO, and WebSocket) enabling cross-platform AI memory, multi-agent coordination, and context sharing. Built with MARM protocol for structured reasoning that evolves with your work.
# Add to your Claude Code skills
git clone https://github.com/Lyellr88/MARM-Systems

Memory Accurate Response Mode v2.2.6 - The intelligent persistent memory system for AI agents (supports HTTP and STDIO). Stop fighting your memory and start controlling it. Experience long-term recall, session continuity, and reliable conversation history, so your LLMs never lose track of what matters.
Note: This is the official MARM repository. All official versions and releases are managed here.
Forks may experiment, but official updates will always come from this repo.
I am currently finalizing the launch of SysDX, a forensic-grade Windows hardware diagnostics toolkit designed to catch intermittent PCIe flapping and TDR crashes through 800+ automated Pester tests. This production-heavy development phase has served as the ultimate stress test for MARM’s core logic, and I am now porting those real-world stability and memory optimizations back into this repository.
🛠️ Q2 Roadmap
Thank you for your patience and support.
This demo video walks through a Docker pull of MARM MCP, connects it to Claude with the `claude mcp add --transport` command, and then shows multiple AI agents (Claude, Gemini, Qwen) instantly sharing logs and notebook entries through MARM's persistent, universal memory, demonstrating seamless cross-agent recall and "absolute truth" notebooks in action.
Your AI forgets everything. MARM MCP doesn't.
Modern LLMs lose context over time, repeat prior ideas, and drift off requirements. MARM MCP solves this with a unified, persistent, MCP‑native memory layer that sits beneath any AI client you use. It blends semantic search, structured session logs, reusable notebooks, and smart summaries so your agents can remember, reference, and build on prior work—consistently, across sessions, and across tools.
MCP in One Sentence: MARM MCP provides persistent memory and structured session context beneath any AI tool, so your agents learn, remember, and collaborate across all your workflows.
| Memory | Multi-AI | Architecture |
|------------|--------------|------------------|
| Semantic Search - Find by meaning using AI embeddings | Unified Memory Layer - Works with Claude, Qwen, Gemini, MCP clients | 18 Complete MCP Tools - Full Model Context Protocol coverage |
| Auto-Classification - Content categorized (code, project, book, general) | Cross-Platform Intelligence - Different AIs learn from shared knowledge | Database Optimization - SQLite with WAL mode and connection pooling |
| Persistent Cross-Session Memory - Memories survive across agent conversations | User-Controlled Memory - "Bring Your Own History," granular control | Rate Limiting - IP-based tiers for stability |
| Smart Recall - Vector similarity search with context-aware fallbacks | | MCP Compliance - Response size management for predictable performance |
| | | Docker Ready - Containerized deployment with health/readiness checks |
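The MCP tools in the table above are exposed over standard JSON-RPC 2.0, as the Model Context Protocol specifies. A minimal sketch of the request any MCP client would send to enumerate MARM's tools; the payload shape is the standard MCP `tools/list` call, not anything MARM-specific:

```python
import json

# Standard MCP JSON-RPC 2.0 request to list the tools a server exposes.
# Any MCP-compliant client sends this shape; MARM should answer with
# its 18 tools over whichever transport (HTTP or STDIO) is configured.
def build_tools_list_request(request_id: int) -> dict:
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    }

request = build_tools_list_request(1)
wire_format = json.dumps(request)  # what actually travels over HTTP/STDIO
print(wire_format)
```

In practice your MCP client (Claude Code, for example) builds and sends this for you; it is shown here only to make the "full Model Context Protocol coverage" claim concrete.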
“MARM successfully handles our industrial automation workflows in production. We've validated session management, persistent logging, and smart recall across container restarts in our Windows 11 + Docker environment. The system reliably tracks complex technical decisions and maintains data integrity through deployment cycles.”
@Ophy21, GitHub user (Industrial Automation Engineer)
“MARM proved exceptionally valuable for DevOps and complex Docker projects. It maintained 100% memory accuracy, preserved context on 46 services and network configurations, and enabled standards-compliant Python/Terraform work. Semantic search and automated session logs made solving async and infrastructure issues far easier. Value Rating: 9.5/10 - indispensable for enterprise-grade memory, technical standards, and long-session code management.”
@joe_nyc, Discord user (DevOps/Infrastructure Engineer)
Now that you understand the ecosystem, here is how to install the MCP server and use it with your AI agents.
Docker Install:
docker pull lyellr88/marm-mcp-server:latest
docker run -d --name marm-mcp-server -p 8001:8001 -v ~/.marm:/home/marm/.marm lyellr88/marm-mcp-server:latest
claude mcp add --transport http marm-memory http://localhost:8001/mcp
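Before registering the server with Claude, you can confirm the container is reachable. A minimal sketch, assuming the server answers plain HTTP on the mapped port; the `/health` path is an assumption based on the "health/readiness checks" feature above, not a documented route:

```python
import urllib.request
import urllib.error

BASE_URL = "http://localhost:8001"
HEALTH_PATH = "/health"  # assumed route; check the server docs for the real one

def is_server_up(base_url: str = BASE_URL, timeout: float = 2.0) -> bool:
    """Return True if the MARM container answers on its (assumed) health route."""
    try:
        with urllib.request.urlopen(base_url + HEALTH_PATH, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

# The network call lives inside a function so this sketch is safe to import.
print(f"Would check {BASE_URL}{HEALTH_PATH}")
```

If this returns False, check `docker ps` and the container logs before running the `claude mcp add` step.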
Local HTTP Install:
pip install marm-mcp-server==2.2.6
pip install -r marm-mcp-server/requirements.txt
python marm-mcp-server
claude mcp add --transport http marm-memory http://localhost:8001/mcp
STDIO Install:
pip install marm-mcp-server==2.2.6
pip install -r marm-mcp-server/requirements_stdio.txt
<platform> mcp add --transport stdio marm-memory-stdio python "your/file/path/to/marm-mcp-server/server_stdio.py"
python marm-mcp-server/server_stdio.py
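In STDIO mode, the client and `server_stdio.py` exchange newline-delimited JSON-RPC messages over the process's stdin/stdout instead of HTTP. A sketch of the `initialize` handshake a client writes first; the message shape is standard MCP, and the `clientInfo` values here are purely illustrative:

```python
import json

# First message in any MCP session: the client introduces itself and
# states the protocol revision it speaks. Sent as a single line on stdin.
initialize_msg = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # one MCP spec revision; newer ones exist
        "clientInfo": {"name": "example-client", "version": "0.1"},  # illustrative
        "capabilities": {},
    },
}

line = json.dumps(initialize_msg) + "\n"  # newline-delimited framing
print(line, end="")
```

Again, the `<platform>` CLI in the step above handles this handshake for you; the sketch only shows what travels over the pipe.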
HTTP Manual JSON Configuration:
{
  "mcpServers": {
    "marm-memory": {
      "httpUrl": "http://localhost:8001/mcp",
      "authentication": {
        "type": "oauth",
        "clientId": "local_client_b6f3a01e",
        "clientSecret": "local_secret_ad6703cd2b4243ab",
        "authorizationUrl": "http://localhost:8001/oauth/authorize",
        "tokenUrl": "http://localhost:8001/oauth/token"
      }
    }
  }
}
MARM includes mock OAuth 2.0 credentials for local testing—not a production authentication system.
Why hardcoded credentials? When developing locally, you don't have external OAuth providers (GitHub, Google, etc.). MARM includes dev credentials so you can test the full MCP authentication flow without external dependencies.
For local development, use these credentials:
Client ID: local_client_b6f3a01e
Client Secret: local_secret_ad6703cd2b4243ab
The server validates against these hardcoded values only during development.
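With those dev credentials, a token request against the `tokenUrl` from the JSON config above would look roughly like this sketch. It assumes the mock endpoint accepts a standard OAuth 2.0 `client_credentials` grant with form encoding, which is the usual convention but is not confirmed by this README:

```python
import urllib.parse

TOKEN_URL = "http://localhost:8001/oauth/token"  # from the config above

# Standard OAuth 2.0 client_credentials grant, form-encoded, using the
# dev credentials above. A production deployment would use its own provider.
def build_token_request(client_id: str, client_secret: str) -> bytes:
    form = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }
    return urllib.parse.urlencode(form).encode("utf-8")

body = build_token_request("local_client_b6f3a01e", "local_secret_ad6703cd2b4243ab")
print(body.decode())  # POST this body to TOKEN_URL to receive a token
```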
For production deployment: Replace this entire section with real OAuth 2.1 authentication. These hardcoded credentials are for local development only.