Your AI (Claude Opus, Codex 5.4) sees 5% of your codebase and hallucinates the rest. Entroly fixes this — 95% fewer tokens, 100% code visibility. Works with Cursor, Claude Code, Copilot.
```bash
# Add to your Claude Code skills
git clone https://github.com/juyterman1000/entroly
```

Every AI coding tool — Cursor, Claude Code, GitHub Copilot, Windsurf, Cody — has the same fatal flaw:
Your AI can only see 5-10 files at a time. The other 95% of your codebase is invisible.
This causes real damage — like editing `auth.py` without knowing about `auth_config.py`.

You've felt this. You paste code manually. You write long system prompts. You pray it doesn't hallucinate. There's a better way.
Entroly compresses your entire codebase into the context window at variable resolution.
| What changes | Before Entroly | After Entroly |
|---|---|---|
| Files visible to AI | 5-10 files | All files (variable resolution) |
| Tokens per request | 186,000 (raw dump) | 9,300-55,000 (70-95% reduction) |
| Cost per 1K requests | ~$560 | $28-$168 |
| AI answer quality | Incomplete, hallucinated | Correct, dependency-aware |
| Setup time | Hours of prompt engineering | 30 seconds |
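The cost rows follow directly from the token counts once you assume a rate of roughly $3 per million input tokens (an illustrative figure for this sketch; actual provider pricing varies):

```python
# Back-of-envelope check of the table's cost figures.
# Assumes ~$3 per million input tokens (illustrative; real pricing varies).
PRICE_PER_M_TOKENS = 3.0

def cost_per_1k_requests(tokens_per_request: int) -> float:
    """Dollar cost of 1,000 requests at the assumed rate."""
    return tokens_per_request * 1_000 * PRICE_PER_M_TOKENS / 1_000_000

raw = cost_per_1k_requests(186_000)   # ~$558, i.e. the "~$560" row
low = cost_per_1k_requests(9_300)     # ~$28 at 95% reduction
high = cost_per_1k_requests(55_000)   # ~$165 at 70% reduction
print(raw, low, high)
```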
Critical files appear in full. Supporting files appear as signatures. Everything else appears as references. Your AI sees the whole picture — and you pay 70-95% less.
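The three-tier idea can be sketched in a few lines. This is my own illustration, not Entroly's API — the `render` function and the importance thresholds are hypothetical:

```python
# Illustrative three-tier rendering: full source, signatures only, or a bare
# reference, chosen by an importance score. Names and thresholds are made up.
def render(path: str, source: str, importance: float) -> str:
    if importance > 0.8:                        # critical file: full text
        return source
    if importance > 0.4:                        # supporting file: signatures only
        sigs = [line for line in source.splitlines()
                if line.lstrip().startswith(("def ", "class "))]
        return "\n".join(sigs) or f"# {path} (no top-level definitions)"
    return f"# see {path}"                      # everything else: a reference

code = "class Auth:\n    def login(self, user):\n        return check(user)\n"
print(render("auth.py", code, 0.9))   # full file
print(render("auth.py", code, 0.5))   # class/def lines only
print(render("auth.py", code, 0.1))   # just "# see auth.py"
```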
| | RAG (vector search) | Entroly (context engineering) |
|---|---|---|
| What it sends | Top-K similar chunks | Entire codebase at optimal resolution |
| Handles duplicates | No — sends same code 3x | SimHash dedup in O(1) |
| Dependency-aware | No | Yes — auto-includes related files |
| Learns from usage | No | Yes — RL optimizes from AI response quality |
| Needs embeddings API | Yes (extra cost + latency) | No — runs locally |
| Optimal selection | Approximate | Mathematically proven (knapsack solver) |
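For the dedup row: a generic SimHash (sketched below, not Entroly's implementation) maps near-identical fragments to fingerprints that differ in only a few bits, so a duplicate check reduces to comparing two fixed-size integers:

```python
import hashlib

def simhash(text: str, bits: int = 64) -> int:
    """Classic SimHash over word features: similar texts get similar fingerprints."""
    v = [0] * bits
    for word in text.split():
        h = int.from_bytes(hashlib.md5(word.encode()).digest()[:8], "big")
        for i in range(bits):
            v[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i in range(bits) if v[i] > 0)

def hamming(a: int, b: int) -> int:
    """Bit distance between two fingerprints; 0 means exact duplicate."""
    return bin(a ^ b).count("1")

a = simhash("def login(user): return check(user)")
b = simhash("def login(user): return check(user)  # same code")
c = simhash("completely unrelated configuration parsing routine")
# Near-duplicates typically differ in few bits; unrelated text in many.
print(hamming(a, b), hamming(a, c))
```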
```bash
pip install entroly && entroly demo   # see savings on YOUR codebase
```
Open the interactive demo for the animated experience.
```bash
pip install entroly[full]
entroly go
```
That's it. `entroly go` auto-detects your IDE, configures everything, and starts the proxy and dashboard. Then point your AI tool to `http://localhost:9377/v1`.
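The proxy endpoint is OpenAI-compatible, so any client that lets you override the base URL should work. A minimal sketch of what such a request looks like (the model name and prompt are placeholders; this only builds the request, it does not send it):

```python
import json

# Point your client at the Entroly proxy instead of the provider directly.
BASE_URL = "http://localhost:9377/v1"

payload = {
    "model": "claude-sonnet",   # placeholder; the proxy forwards to your provider
    "messages": [{"role": "user", "content": "Where is auth configured?"}],
}
request = {
    "url": f"{BASE_URL}/chat/completions",
    "headers": {"Content-Type": "application/json"},
    "body": json.dumps(payload),
}
# POST `request` with any HTTP client; the proxy injects the compressed
# codebase context before forwarding the call upstream.
print(request["url"])
```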
```bash
pip install entroly               # core engine
entroly init                      # detect IDE + generate config
entroly proxy --quality balanced  # start proxy
```
```bash
npm install entroly
```

```bash
docker pull ghcr.io/juyterman1000/entroly:latest
docker run --rm -p 9377:9377 -p 9378:9378 -v .:/workspace:ro ghcr.io/juyterman1000/entroly:latest
```
| Package | What you get |
|---------|---|
| pip install entroly | Core — MCP server + Python engine |
| pip install entroly[proxy] | + HTTP proxy mode |
| pip install entroly[native] | + Rust engine (50-100x faster) |
| pip install entroly[full] | Everything |
| AI Tool | Setup | Method |
|---------|-------|--------|
| Cursor | entroly init | MCP server |
| Claude Code | claude mcp add entroly -- entroly | MCP server |
| VS Code + Copilot | entroly init | MCP server |
| Windsurf | entroly init | MCP server |
| Cline | entroly init | MCP server |
| OpenClaw | See below | Context Engine |
| Cody | entroly proxy | HTTP proxy |
| Any LLM API | entroly proxy | HTTP proxy |
> "I stopped manually pasting code into Claude. Entroly just works."
`entroly go` handles everything. No YAML, no embeddings, no prompt engineering.

OpenClaw users get the deepest integration — Entroly plugs in as a Context Engine:
| Agent Type | What Entroly Does | Token Savings |
|---|---|---|
| Main agent | Full codebase at variable resolution | ~95% |
| Heartbeat | Only loads changes since last check | ~90% |
| Subagents | Inherited context + Nash bargaining budget split | ~92% |
| Cron jobs | Minimal context — relevant memories + schedule | ~93% |
| Group chat | Entropy-filtered messages — only high-signal kept | ~90% |
```python
from entroly.context_bridge import MultiAgentContext

ctx = MultiAgentContext(workspace_path="~/.openclaw/workspace")
ctx.ingest_workspace()  # index the workspace once
sub = ctx.spawn_subagent("main", "researcher", "find auth bugs")
```
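The "Nash bargaining budget split" row above can be illustrated with a toy version (my own sketch, not Entroly's actual scheme): maximizing the product of agent utilities `x_i ** w_i` subject to a shared budget gives each agent a share proportional to its bargaining weight.

```python
# Toy asymmetric Nash bargaining over a shared token budget: maximizing
# prod(x_i ** w_i) subject to sum(x_i) = budget yields x_i = budget * w_i / sum(w).
# The agent names and weights below are illustrative, not Entroly's defaults.
def nash_split(budget: int, weights: dict[str, float]) -> dict[str, int]:
    total = sum(weights.values())
    return {name: int(budget * w / total) for name, w in weights.items()}

split = nash_split(100_000, {"main": 3.0, "researcher": 1.0, "reviewer": 1.0})
print(split)  # {'main': 60000, 'researcher': 20000, 'reviewer': 20000}
```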
| Stage | What | Result |
|---|---|---|
| 1. Ingest | Index codebase, build dependency graph, fingerprint fragments | Complete map in <2s |
| 2. Score | Rank by information density — high-value code up, boilerplate down | Every fragment scored |
| 3. Select | Mathematically optimal subset fitting your token budget | Proven optimal (knapsack) |
| 4. Deliver | 3 resolution levels: full → signatures → references | 100% coverage |
| 5. Learn | Track which context produced good AI responses | Gets smarter over time |
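Stage 3's knapsack claim maps onto the classic 0/1 knapsack problem: choose the fragment subset that maximizes total information score without exceeding the token budget. A minimal dynamic-programming sketch (the fragment names, costs, and scores are hypothetical):

```python
# 0/1 knapsack over (name, token_cost, score) triples: exact optimum by tracking
# the best score achievable at each exact token count. Data below is made up.
def select(fragments, budget):
    best = {0: (0.0, [])}                        # tokens_used -> (score, chosen)
    for name, cost, score in fragments:
        for used, (s, chosen) in list(best.items()):  # snapshot: use item once
            t = used + cost
            if t <= budget and s + score > best.get(t, (-1.0, []))[0]:
                best[t] = (s + score, chosen + [name])
    return max(best.values(), key=lambda v: v[0])[1]

frags = [("auth.py", 800, 9.0), ("auth_config.py", 300, 7.0),
         ("utils.py", 700, 4.0), ("legacy.py", 900, 2.0)]
print(select(frags, 1200))  # ['auth.py', 'auth_config.py']
```

At a 1,200-token budget the optimum pairs `auth.py` with its dependency `auth_config.py` (score 16.0 in 1,100 tokens) rather than any higher-cost combination — the dependency-aware behavior the table describes.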
> "The LLM is the CPU, the context window is RAM."
| Layer | What it solves |
|---|---|
| Documentation tools | Give your agent up-to-date API docs |
| Memory systems | Remember things across conversations |
| RAG / retrieval | Find relevant code chunks |
| Entroly (optimization) | Makes everything fit — optimally compresses codebase + docs + memory into the token budget |
These layers are complementary. Entroly is the optimization layer that ensures everything fits without waste.
| Command | What it does |
|---------|---|
| entroly go | One command — auto-detect, init, proxy, dashboard |
| entroly demo | Before/after comparison with dollar savings on YOUR project |
| entroly dashboard | Live metrics: savings trends, health grade, PRISM weights |
| entroly doctor | 7 diagnostic checks — finds problems before you do |
| entroly health | Codebase health grade (A-F): clones, dead code, god files |
| entroly benchmark | Competitive benchmark: Entroly vs raw context vs top-K |
| entroly role | Weight presets: frontend, backend, sre, data, fullstack |
| entroly autotune | Auto-optimize engine parameters |
| entroly digest | Weekly summary: tokens saved, cost reduction |
| entroly status | Check running services |
```bash
entroly proxy --quality speed     # minimal optimization, lowest latency
entroly proxy --quality balanced  # recommended (default)
entroly proxy --quality max       # full pipeline, best results
entroly proxy --quality 0.7       # any float 0.0-1.0
```
| | Linux | macOS | Windows |
|---|---|---|---|
| Python 3.10+ | Yes | Yes | Yes |
| Rust wheel | Yes | Yes (Intel + Apple Silicon) | Yes |
| Docker | Optional | Optional | Optional |
| Admin/WSL required | No | No | No |
Savings history is stored in `~/.entroly/value_tracker.json`, with trend charts in the dashboard.