by clawdotnet
Self-hosted OpenClaw gateway + agent runtime in .NET (NativeAOT-friendly)
# Add to your Claude Code skills
git clone https://github.com/clawdotnet/openclaw.net

Disclaimer: This project is not affiliated with, endorsed by, or associated with OpenClaw. It is an independent .NET implementation inspired by their work.
OpenClaw.NET is a NativeAOT-friendly AI agent runtime and gateway for .NET with practical OpenClaw ecosystem compatibility.
It is for .NET developers and operators who want a local or self-hosted agent gateway with explicit diagnostics, first-party .NET tools, OpenAI-compatible HTTP surfaces, and a path from source checkout to NativeAOT release artifacts.
Docs: AgentQi.dev is the documentation and ecosystem home for OpenClaw.NET. OpenClaw.NET remains the current runtime and repository identity.
AgentQi is the broader developer-infrastructure direction behind OpenClaw.NET: practical, observable, self-hosted AI agent systems for .NET developers.
OpenClaw.NET is the runtime and repository you can use today. AgentQiX is the likely future runtime identity.
Start here:
Start with docs/START_HERE.md for the evaluator overview, docs/QUICKSTART.md for the supported local setup path, or docs/RELEASES.md for desktop downloads.
For Microsoft Agent Framework, A2A, and durable workflow setup, see docs/integrations/microsoft-agent-framework.md, docs/a2a.md, and docs/workflow-backends.md.
For the lowest-friction desktop start, download the latest desktop bundle for your platform:
| Platform | Download |
|----------|----------|
| Windows x64 | openclaw-desktop-win-x64.zip |
| Apple Silicon macOS | openclaw-desktop-osx-arm64.zip |
| Linux x64 | openclaw-desktop-linux-x64.zip |
Each desktop bundle includes Companion, the NativeAOT gateway, and the NativeAOT CLI.
Launch Companion from the companion folder inside the bundle. Companion writes a local config, starts the bundled gateway on 127.0.0.1, and connects to it. The current Windows and macOS release archives are unsigned, so first-run OS warnings are expected. See docs/RELEASES.md for checksums, standalone CLI/gateway archives, signing status, and the maintainer release flow.
For a real local gateway from source:
export MODEL_PROVIDER_KEY="sk-..."
dotnet run --project src/OpenClaw.Cli -c Release -- start
When startup finishes, the gateway prints explicit phase markers, a final "OpenClaw gateway ready." block, the localhost URLs, a Ctrl-C-to-stop hint, and any non-fatal startup notices under "Started with notices:". Then open:
| Surface | URL |
|---------|-----|
| Web UI / Live Chat | http://127.0.0.1:18789/chat |
| Admin UI | http://127.0.0.1:18789/admin |
| Integration API | http://127.0.0.1:18789/api/integration/status |
| MCP endpoint | http://127.0.0.1:18789/mcp |
The root URL redirects to /chat. For the full first-run walkthrough (including the "First 10 Minutes" runbook and debugging flow), see docs/QUICKSTART.md. For the project shape and repository map before changing code, see docs/GETTING_STARTED.md.
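The Integration API status endpoint is a convenient first health check once the gateway is up. A minimal probe sketch, assuming the default port above and assuming only that the endpoint returns JSON (the response schema is not specified here):

```python
import json
import urllib.error
import urllib.request

GATEWAY = "http://127.0.0.1:18789"  # default local gateway address from the table above

def integration_status(base: str = GATEWAY, timeout: float = 2.0):
    """Fetch /api/integration/status; return the parsed JSON payload,
    or None if the gateway is not reachable or did not return JSON."""
    try:
        with urllib.request.urlopen(f"{base}/api/integration/status", timeout=timeout) as resp:
            return json.loads(resp.read().decode("utf-8"))
    except (urllib.error.URLError, TimeoutError, ValueError):
        return None

status = integration_status()
print("gateway up" if status is not None else "gateway not reachable")
```

Returning None instead of raising makes this safe to drop into a startup script that waits for the gateway.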
If you want a direct gateway fallback instead of the full CLI onboarding flow, run:
dotnet run --project src/OpenClaw.Gateway -c Release -- --quickstart
--quickstart is interactive-only. It applies a minimal loopback-local profile for the current process, prompts for missing provider inputs, retries on the common first-run failures, and after a successful start can save the working setup to ~/.openclaw/config/openclaw.settings.json.
If the CLI is already on your PATH, the same guided entrypoints are:
openclaw start
openclaw setup
openclaw setup launch --config ~/.openclaw/config/openclaw.settings.json
openclaw setup service --config ~/.openclaw/config/openclaw.settings.json --platform all
openclaw setup status --config ~/.openclaw/config/openclaw.settings.json
openclaw upgrade check --config ~/.openclaw/config/openclaw.settings.json
openclaw upgrade rollback --config ~/.openclaw/config/openclaw.settings.json --offline
Useful follow-up commands and surfaces:
openclaw models presets
openclaw models packages
openclaw models install gemma-4-e4b --accept-license --path ~/Downloads/gemma-4-E4B-it-Q4_K_M.gguf --mmproj-path ~/Downloads/mmproj-gemma-4-E4B-it-Q8_0.gguf
openclaw models doctor
openclaw maintenance scan --config ~/.openclaw/config/openclaw.settings.json
openclaw maintenance fix --config ~/.openclaw/config/openclaw.settings.json --dry-run
openclaw skills inspect ./skills/my-skill
openclaw compatibility catalog
openclaw insights
openclaw admin trajectory export --anonymize --output ./trajectory.jsonl
openclaw upgrade check --config ~/.openclaw/config/openclaw.settings.json --offline
openclaw upgrade rollback --config ~/.openclaw/config/openclaw.settings.json --offline
openclaw migrate upstream --source ./upstream-agent --target-config ~/.openclaw/config/openclaw.settings.json
Admin surfaces behind these commands include:

/admin/skills
/admin/maintenance
/admin/observability/summary
/admin/insights
/admin/audit/export
/admin/trajectory/export

For local Ollama setups, prefer the native root endpoint and an explicit preset:
openclaw setup --non-interactive --profile local --workspace ./workspace --provider ollama --model llama3.2 --model-preset ollama-general
OpenClaw.NET now treats Ollama as a first-class native provider at http://127.0.0.1:11434. Older /v1 endpoints still work for one compatibility cycle, but openclaw models doctor will flag them so you can migrate cleanly.
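The "native root endpoint" means talking to Ollama's own API at the server root rather than the OpenAI-compatible /v1 path. A sketch of what that looks like directly, assuming a local Ollama with the llama3.2 model pulled (and falling back to None when Ollama is not running):

```python
import json
import urllib.error
import urllib.request

OLLAMA = "http://127.0.0.1:11434"  # native root endpoint, no /v1 suffix

def ollama_generate(prompt: str, model: str = "llama3.2", base: str = OLLAMA):
    """One non-streaming completion via Ollama's native /api/generate endpoint.
    Returns the response text, or None if Ollama is not reachable locally."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(f"{base}/api/generate", data=body,
                                 headers={"Content-Type": "application/json"})
    try:
        with urllib.request.urlopen(req, timeout=30) as resp:
            return json.loads(resp.read().decode("utf-8")).get("response")
    except (urllib.error.URLError, TimeoutError, ValueError):
        return None
```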
For OpenClaw-managed local inference, use the embedded provider with an installable model package. Gemma 4 is now the main embedded local model path:
openclaw models packages
openclaw models install gemma-4-e4b \
--accept-license \
--path ~/Downloads/gemma-4-E4B-it-Q4_K_M.gguf \
--mmproj-path ~/Downloads/mmproj-gemma-4-E4B-it-Q8_0.gguf
openclaw setup --provider embedded --model-preset embedded-gemma-4-e4b --model gemma-4-e4b
openclaw models status gemma-4-e4b
The package catalog includes Gemma 4 E2B, E4B, 31B, and 26B-A4B GGUF entries, plus the experimental Gemma 4 E2B LiteRT-LM package for adapter work. The older gemma-local-small-q4 Gemma 3 package remains available for smaller machines.
Embedded video support is frame-based: OpenClaw samples local video/* inputs into ordered image frames before calling the local sidecar, and Gemma 4 GGUF packages include the multimodal projector file needed for image-frame inputs. LiteRT-LM packages are experimental and require an OpenClaw-compatible adapter binary; see docs/LOCAL_MODELS.md.
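The runtime's actual sampling policy is internal, but the general frame-based idea can be sketched as evenly spaced midpoint sampling under a frame budget. The budget of 8 frames and the one-frame-per-second midpoint rule below are illustrative assumptions, not the runtime's documented behavior:

```python
def sample_frame_times(duration_s: float, max_frames: int = 8) -> list[float]:
    """Pick evenly spaced timestamps (segment midpoints) at which to
    extract image frames from a local video input."""
    if duration_s <= 0:
        return []
    # Roughly one frame per second, capped at the frame budget.
    n = min(max_frames, max(1, round(duration_s)))
    step = duration_s / n
    return [round((i + 0.5) * step, 3) for i in range(n)]

print(sample_frame_times(4.0))  # 4 frames: [0.5, 1.5, 2.5, 3.5]
```

The sampled frames would then be passed to the local sidecar as ordered image inputs, which is why the GGUF packages ship the multimodal projector file.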
Breaking change: browser admin usage is account/session-first. Use named operator accounts for
/admin, and use operator account tokens for Companion, CLI, API, and websocket clients.
When binding to a non-loopback address, the gateway refuses to start unless dangerous settings are explicitly hardened (auth token required, tool