by varun29ankuS
Cognitive memory for AI agents — learns from use, forgets what's irrelevant, strengthens what matters. Single binary, fully offline.
```bash
# Add to your Claude Code skills
git clone https://github.com/varun29ankuS/shodh-memory
```

AI agents forget everything between sessions. They repeat mistakes, lose context, and treat every conversation like the first one.
Shodh-Memory fixes this. It's persistent memory that actually learns — memories you use often become easier to find, old irrelevant context fades automatically, and recalling one thing brings back related things. No API keys. No cloud. No external databases. One binary.
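The "learns from use, forgets what's irrelevant" behavior can be sketched as usage-weighted scoring with exponential decay. This is an illustrative model only, not Shodh's actual implementation; the half-life and reinforcement values are made up:

```python
import math

class Memory:
    """Toy memory item: strength grows with each recall, decays over time."""

    def __init__(self, text, now=0.0):
        self.text = text
        self.strength = 1.0
        self.last_access = now

    def effective_strength(self, now, half_life=7 * 24 * 3600):
        # Exponential decay: a memory untouched for one half-life
        # (here, a week) is half as easy to find.
        elapsed = now - self.last_access
        return self.strength * 0.5 ** (elapsed / half_life)

    def recall(self, now):
        # Hebbian-style reinforcement: each use adds to whatever
        # strength survived decay, so frequently used memories
        # stay easy to retrieve.
        self.strength = self.effective_strength(now) + 1.0
        self.last_access = now

m = Memory("user prefers tabs over spaces", now=0.0)
m.recall(now=0.0)                     # one use -> strength 2.0
week = 7 * 24 * 3600
print(m.effective_strength(week))     # one half-life later -> 1.0
```

The key property is that both strengthening and decay are pure arithmetic on timestamps and counters, so no LLM call is needed at store or recall time.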
|  | Shodh | mem0 | Cognee | Zep |
|---|---|---|---|---|
| LLM calls to store a memory | 0 | 2+ per add | 3+ per cognify | 2+ per episode |
| External services needed | None | OpenAI + vector DB | OpenAI + Neo4j + vector DB | OpenAI + Neo4j |
| Time to store a memory | — | ~20 seconds | seconds | seconds |
| Learns from usage | Yes (Hebbian) | No | No | No |
| Forgets irrelevant data | Yes (decay) | No | No | Temporal only |
| Runs fully offline | Yes | No | No | No |
| Binary size | — | pip install + API keys | pip install + API keys + Neo4j | Cloud only |
<details>
<summary>Or use a binary instead of Docker</summary>
</details>
Every other memory system delegates intelligence to LLM API calls — that's why they're slow, expensive, and can't work offline. Shodh uses algorithmic intelligence: local embeddings, mathematical decay, learned associations. No LLM in the loop.
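"Learned associations" of this kind are commonly built as co-recall edges whose weights grow when two memories are retrieved together, so that recalling one item surfaces its neighbors. A minimal one-hop spreading-activation sketch, again illustrative rather than Shodh's actual algorithm:

```python
from collections import defaultdict

class AssociativeStore:
    """Toy associative memory: items recalled together get linked,
    and recalling one item surfaces its strongest neighbors."""

    def __init__(self):
        self.edges = defaultdict(float)  # (a, b) -> association weight

    def co_recall(self, a, b):
        # Hebbian rule: items recalled together wire together.
        key = tuple(sorted((a, b)))
        self.edges[key] += 1.0

    def related(self, item, top_k=3):
        # One hop of spreading activation: rank neighbors by edge weight.
        neighbors = []
        for (a, b), weight in self.edges.items():
            if a == item:
                neighbors.append((b, weight))
            elif b == item:
                neighbors.append((a, weight))
        neighbors.sort(key=lambda nw: -nw[1])
        return [name for name, _ in neighbors[:top_k]]

store = AssociativeStore()
store.co_recall("rust", "cargo")
store.co_recall("rust", "cargo")   # repeated co-recall -> stronger edge
store.co_recall("rust", "tokio")
print(store.related("rust"))       # ['cargo', 'tokio']
```

Like the decay model, this is plain bookkeeping over a local graph, which is what lets association learning run with zero API calls.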
```bash
# 1. Start the server
docker run -d -p 3030:3030 -v shodh-data:/data varunshodh/shodh-memory

# 2. Add to Claude Code
claude mcp add shodh-memory -- npx -y @shodh/memory-mcp
```
That's it. Claude now has persistent memory across sessions.