by qhkm
AI coding agent — 33 tools, 16 providers, 11 channels. The agent runtime that ZeptoRT supervises.
# Add to your Claude Code skills
git clone https://github.com/qhkm/zeptoclaw

$ zeptoclaw agent --stream -m "Analyze our API for security issues"
🤖 ZeptoClaw — Streaming analysis...
[web_fetch] Fetching API docs...
[shell] Running integration tests...
[longterm_memory] Storing findings...
→ Found 12 endpoints, 3 missing auth headers, 1 open redirect
→ Saved findings to long-term memory under "api-audit"
✓ Analysis complete in 4.2s
We studied what works — and what doesn't.
OpenClaw proved an AI assistant can handle 12 channels and 100+ skills. But it costs 100MB and 400K lines. NanoClaw proved security-first is possible. But it's still 50MB of TypeScript. NemoClaw proved enterprise governance matters — policy-locked sandboxes, federated inference routing. But it's a 2GB Docker container wrapping OpenClaw underneath, with zero built-in tools. PicoClaw proved AI assistants can run on $10 hardware. But it stripped out everything to get there.
ZeptoClaw took notes. The integrations, the security, the governance, the size discipline — without the tradeoffs each one made. One 6MB Rust binary that starts in 50ms, uses 6MB of RAM, and ships with container isolation, prompt injection detection, and a circuit breaker provider stack.
| | OpenClaw | NemoClaw | NanoClaw | PicoClaw | ZeptoClaw |
|---|---|---|---|---|---|
| Size | ~100MB | ~2GB (Docker) | ~50MB | <1MB | ~6MB |
| Language | JS/TS | JS/TS/Python | TypeScript | Go | Rust |
| Tools | 100+ skills | 0 (inference only) | ~20 | ~5 | 33 |
| Providers | 5 | NVIDIA-first | 3 | 2 | 16 |
| Channels | 12 | 0 (uses OpenClaw) | 3 | 0 | 11 |
| Sandboxing | None | Landlock + seccomp | Basic | None | 6 runtimes |
| Runs on $10 hardware | No | No (needs GPU) | No | Yes | |
AI agents execute code. Most frameworks trust that nothing will go wrong.
The OpenClaw ecosystem has seen CVE-2026-25253 (CVSS 8.8 — cross-site WebSocket hijacking to RCE), ClawHavoc (341 malicious skills, 9,000+ compromised installations), and 42,000 exposed instances with auth bypass. ZeptoClaw was built with this threat model in mind.
| Layer | What it does |
|-------|-------------|
| 6 Sandbox Runtimes | Docker, Apple Container, Landlock, Firejail, Bubblewrap, or native — per request |
| Prompt Injection Detection | Aho-Corasick multi-pattern matcher (17 patterns) + 4 regex rules |
| Secret Leak Scanner | 22 regex patterns catch API keys, tokens, and credentials before they reach the LLM |
| Policy Engine | 7 rules blocking system file access, crypto key extraction, SQL injection, encoded exploits |
| Input Validator | 100KB limit, null byte detection, whitespace ratio analysis, repetition detection |
| Shell Blocklist | Regex patterns blocking reverse shells, rm -rf, privilege escalation |
| SSRF Prevention | DNS pinning, private IP blocking, IPv6 transition guard, scheme validation |
| Chain Alerting | Detects dangerous tool call sequences (write→execute, memory→execute) |
| Tool Approval Gate | Require explicit confirmation before executing dangerous tools |
Every layer runs by default. No flags to remember, no config to enable.
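To make the chain-alerting layer concrete, here is a minimal, stdlib-only Rust sketch of the idea: scan the sequence of tool calls for risky adjacent pairs such as write→execute or memory→execute. The function name and the `write_file` tool name are hypothetical; ZeptoClaw's internal API and pattern list will differ.

```rust
// Hypothetical sketch of chain alerting: flag dangerous tool-call
// sequences (write -> execute, memory -> execute). Names are
// illustrative, not ZeptoClaw's actual internals.
fn is_dangerous_chain(calls: &[&str]) -> bool {
    // Pairs of (earlier tool, later tool) considered risky.
    const RISKY: &[(&str, &str)] = &[
        ("write_file", "shell"),
        ("longterm_memory", "shell"),
    ];
    calls
        .windows(2)
        .any(|w| RISKY.iter().any(|(a, b)| w[0] == *a && w[1] == *b))
}

fn main() {
    // A write followed by a shell execution should trip the alert.
    assert!(is_dangerous_chain(&["write_file", "shell"]));
    // A read-only sequence should not.
    assert!(!is_dangerous_chain(&["grep", "read_file"]));
    println!("chain checks passed");
}
```

A real implementation would likely look beyond adjacent calls (e.g. a sliding window over the whole session), but the pair-matching shape is the core of the check.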
# One-liner (macOS / Linux)
curl -fsSL https://raw.githubusercontent.com/qhkm/zeptoclaw/main/install.sh | sh
# Homebrew
brew install qhkm/tap/zeptoclaw
# Docker
docker pull ghcr.io/qhkm/zeptoclaw:latest
# Build from source
cargo install zeptoclaw --git https://github.com/qhkm/zeptoclaw
The control panel is an optional compile-time feature. To use zeptoclaw panel or zeptoclaw serve, build or install with --features panel:

# Build from source with the control panel enabled
cargo install zeptoclaw --git https://github.com/qhkm/zeptoclaw --features panel
# Remove ZeptoClaw state (~/.zeptoclaw)
zeptoclaw uninstall --yes
# Also remove a direct-install binary from ~/.local/bin or /usr/local/bin
zeptoclaw uninstall --remove-binary --yes
# Package-managed installs still use their package manager
brew uninstall qhkm/tap/zeptoclaw
cargo uninstall zeptoclaw
# Interactive setup (walks you through API keys, channels, workspace)
zeptoclaw onboard
# Talk to your agent
zeptoclaw agent -m "Hello, set up my workspace"
# Stream responses token-by-token
zeptoclaw agent --stream -m "Explain async Rust"
# Use a built-in template
zeptoclaw agent --template researcher -m "Search for Rust agent frameworks"
# Process prompts in batch
zeptoclaw batch --input prompts.txt --output results.jsonl
# Start as a Telegram/Slack/Discord/Webhook gateway
zeptoclaw gateway
# With full container isolation per request
zeptoclaw gateway --containerized
Already running OpenClaw? ZeptoClaw can import your config and skills in one command.
# Auto-detect OpenClaw installation (~/.openclaw, ~/.clawdbot, ~/.moldbot)
zeptoclaw migrate
# Specify path manually
zeptoclaw migrate --from /path/to/openclaw
# Preview what would be migrated (no files written)
zeptoclaw migrate --dry-run
# Non-interactive (skip confirmation prompts)
zeptoclaw migrate --yes
The migration command copies your skills into ~/.zeptoclaw/skills/ and supports both JSON and JSON5 config files (comments, trailing commas, unquoted keys).
curl -fsSL https://zeptoclaw.com/setup.sh | bash
Installs the binary and prints next steps. Run zeptoclaw onboard to configure providers and channels.
ZeptoClaw supports 16 LLM providers. All OpenAI-compatible endpoints work out of the box.
| Provider | Config key | Setup |
|----------|------------|-------|
| Anthropic | anthropic | api_key |
| OpenAI | openai | api_key |
| OpenRouter | openrouter | api_key |
| Google Gemini | gemini | api_key |
| Groq | groq | api_key |
| DeepSeek | deepseek | api_key |
| xAI (Grok) | xai | api_key |
| NVIDIA NIM | nvidia | api_key |
| Azure OpenAI | azure | api_key + api_base |
| AWS Bedrock | bedrock | api_key |
| Kimi (Moonshot) | kimi | api_key |
| Zhipu (GLM) | zhipu | api_key |
| Qianfan (Baidu) | qianfan | api_key |
| Novita AI | novita | api_key |
| Ollama | ollama | api_key (any value) |
| VLLM | vllm | api_key (any value) |
Configure in ~/.zeptoclaw/config.json or via environment variables:
{
  "providers": {
    "openrouter": { "api_key": "sk-or-..." },
    "ollama": { "api_key": "ollama" }
  },
  "agents": { "defaults": { "model": "anthropic/claude-sonnet-4" } }
}
export ZEPTOCLAW_PROVIDERS_GROQ_API_KEY=gsk_...
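Generalizing from the Groq example, the variable naming appears to follow the pattern ZEPTOCLAW_PROVIDERS_<CONFIG_KEY>_API_KEY with the provider's config key uppercased (this pattern is an assumption, not confirmed by the docs):

```shell
# Assumed pattern: ZEPTOCLAW_PROVIDERS_<CONFIG_KEY>_API_KEY
export ZEPTOCLAW_PROVIDERS_ANTHROPIC_API_KEY=sk-ant-...
export ZEPTOCLAW_PROVIDERS_OLLAMA_API_KEY=ollama
```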
Any provider's base URL can be overridden with api_base for proxies or self-hosted endpoints. See the provider docs for full details.
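For example, pointing the openai provider at a self-hosted, OpenAI-compatible endpoint might look like this (the URL and key value are illustrative):

```json
{
  "providers": {
    "openai": {
      "api_key": "unused-for-local",
      "api_base": "http://localhost:8000/v1"
    }
  }
}
```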
| Feature | What it does |
|---------|-------------|
| Multi-Provider LLM | 16 providers with SSE streaming, retry with backoff + budget cap, auto-failover |
| 33 Tools + Plugins | Shell, filesystem, grep, find, web, git, stripe, PDF, transcription, Android ADB, and more |
| Tool Composition | Create new tools from natural language descriptions — composable {{param}} templates |
| Agent Swarms | Delegate to sub-agent