🦀 Personal AI assistant platform, rewritten in Rust from OpenClaw
A single 16 MB binary that runs your AI agents across Telegram, Discord, Slack, WhatsApp, WhatsApp Web, LINE, WeChat, iMessage, and MQTT, with encrypted credential storage, config hot-reload, and 13 MB of RAM at idle. Built in Rust for the security and reliability that AI agents demand.

# Add to your Claude Code skills
git clone https://github.com/opencrust-org/opencrust
# Install (Linux, macOS)
curl -fsSL https://raw.githubusercontent.com/opencrust-org/opencrust/main/install.sh | sh
# Interactive setup - pick your LLM provider and channels
opencrust init
# Start - on first message, the agent will introduce itself and learn your preferences
opencrust start
# Diagnose configuration, connectivity, and database health
opencrust doctor
# Build from source (requires Rust 1.85+)
cargo build --release
./target/release/opencrust init
./target/release/opencrust start
# Optional: include WASM plugin support
cargo build --release --features plugins
Once the gateway is running, open your browser at:
http://127.0.0.1:3888
The built-in web UI lets you chat with your agent, switch LLM providers on the fly, manage MCP servers, and monitor connected channels, all without restarting.
Authentication: if `api_key` is set in `config.yml`, the UI will prompt for the gateway key before connecting.
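For reference, a minimal `config.yml` fragment enabling UI authentication might look like this. This is a sketch: only the `api_key` key name is confirmed above; the value is a placeholder you generate yourself.

```yaml
# config.yml — gateway authentication
# Sketch: only the `api_key` key is confirmed by the docs above.
api_key: "replace-with-a-long-random-secret"
```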
Chat with your agent directly from the terminal, no browser needed.
Requires a running gateway. Run `opencrust init` (first time only), then `opencrust start` before using `opencrust chat`.
# First-time setup
opencrust init
opencrust start # or: opencrust start -d (daemon mode)
# Open terminal chat
opencrust chat
opencrust chat --agent coder # start with a named agent
opencrust chat --url http://host:3888 # connect to a remote gateway
╭─── OpenCrust Chat v0.2.9 ───────────────────────╮
│                                                 │
│            _~^~^~_                              │
│        \) /  o o  \ (/                          │
│          '_   -   _'                            │
│          / '-----' \                            │
│                                                 │
│  Gateway  http://127.0.0.1:3888                 │
│  Agent    default                               │
│                                                 │
│  Type /help for commands                        │
│                                                 │
╰─────────────────────────────────────────────────╯
you › what's the difference between Go channels and Rust async?
bot › Go channels are built into the language: goroutines communicate
      by passing values through typed channels (make(chan int, 5)).
      Rust async uses Future + tokio::sync::mpsc for similar patterns,
      but ownership is enforced at compile time, so data races are
      impossible without unsafe.
you › /agent coder
Switched to agent: coder.
you › show me a rust mpsc example
bot › use tokio::sync::mpsc;

      #[tokio::main]
      async fn main() {
          let (tx, mut rx) = mpsc::channel(8);
          tokio::spawn(async move { tx.send(42).await.unwrap(); });
          println!("{}", rx.recv().await.unwrap()); // 42
      }
you › /exit
Goodbye!
Chat commands: /help · /new (fresh session) · /agent <id> · /clear · /exit
Pre-compiled binaries for Linux (x86_64, aarch64), macOS (Intel, Apple Silicon), and Windows (x86_64) are available on GitHub Releases.
| Feature | OpenCrust | OpenClaw (Node.js) | ZeroClaw (Rust) |
|---|---|---|---|
| Binary size | 16 MB | ~1.2 GB (with node_modules) | ~25 MB |
| Memory at idle | 13 MB | ~388 MB | ~20 MB |
| Cold start | 3 ms | 13.9 s | ~50 ms |
| Credential storage | AES-256-GCM encrypted vault | Plaintext config file | Plaintext config file |
| Auth default | Enabled (WebSocket pairing) | Disabled by default | Disabled by default |
| Scheduling | Cron, interval, one-shot | Yes | No |
| Multi-agent routing | Yes (named agents) | Yes (agentId) | No |
| Session orchestration | Yes | Yes | No |
| MCP support | Stdio + HTTP | Stdio + HTTP | Stdio |
| Channels | 9 | 6+ | 4 |
| LLM providers | 15 | 10+ | 22+ |
| Pre-compiled binaries | Yes | N/A (Node.js) | Build from source |
| Config hot-reload | Yes | No | No |
| WASM plugin system | Optional (sandboxed) | No | No |
| Self-update | Yes (opencrust update) | npm | Build from source |
Benchmarks measured on a 1 vCPU, 1 GB RAM DigitalOcean droplet.
OpenCrust is built for the security requirements of always-on AI agents that access private data and communicate externally.
- Encrypted vault: credentials live in ~/.opencrust/credentials/vault.json. Never plaintext on disk.
- Sandboxed plugins: WASM plugin support is opt-in (--features plugins).
- Localhost-only: the gateway binds to 127.0.0.1 by default, not 0.0.0.0.

Native providers: …

OpenAI-compatible providers: configured via base_url.

Voice / TTS: OpenAI TTS (tts-1, tts-1-hd) or any OpenAI-compatible endpoint.

- auto_reply_voice: true synthesizes every text response as audio automatically.
- tts_max_chars limits synthesis length; long responses are truncated with a warning.
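A sketch of the voice settings in `config.yml`. Only the key names `auto_reply_voice` and `tts_max_chars` and the model names are confirmed above; the nesting under a `tts` section and the example values are assumptions.

```yaml
# config.yml — voice synthesis (sketch)
# Assumption: keys live under a `tts` section; values below are illustrative.
tts:
  model: tts-1            # or tts-1-hd, or any OpenAI-compatible endpoint
  auto_reply_voice: true  # synthesize every text response as audio
  tts_max_chars: 2000     # longer responses are truncated with a warning
```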