by Mng-dev-ai
Your own Claude Code UI: sandbox, in-browser VS Code, terminal, multi-provider support (Anthropic, OpenAI, GitHub Copilot, OpenRouter), custom skills, and MCP servers.
```shell
# Add to your Claude Code skills
git clone https://github.com/Mng-dev-ai/claudex
```

Self-hosted Claude Code workspace with multi-provider routing, sandboxed execution, and a full web IDE.
Note: Claudex is under active development. Expect breaking changes between releases.
Join the Discord server.
```
React/Vite Frontend
-> FastAPI Backend
-> PostgreSQL + Redis (web/docker mode)
-> SQLite + in-memory cache/pubsub (desktop mode)
-> Sandbox runtime (Docker/Host)
-> Claude Code CLI + claude-agent-sdk
```
Claudex runs chats through `claude-agent-sdk`, which drives the Claude Code CLI in the selected sandbox. This keeps Claude Code-native behavior for tools, session flow, permission modes, and MCP orchestration.

For the OpenAI, OpenRouter, and Copilot providers, Claudex starts anthropic-bridge inside the sandbox and routes Claude Code requests through:
- `ANTHROPIC_BASE_URL=http://127.0.0.1:3456`, so the CLI talks to the local bridge
- provider credentials via `OPENROUTER_API_KEY` and `GITHUB_COPILOT_TOKEN`
- provider-prefixed model IDs such as `openai/gpt-5.2-codex`, `openrouter/moonshotai/kimi-k2.5`, and `copilot/gpt-5.2-codex`

```
Claudex UI
-> Claude Agent SDK + Claude Code CLI
-> Anthropic-compatible request shape
-> Anthropic Bridge (OpenAI/OpenRouter/Copilot)
-> Target provider model
```
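Inside the sandbox, the bridge routing above amounts to environment configuration along these lines. This is an illustrative sketch only: the variable names come from this README, the values are placeholders, and the actual wiring is done by Claudex rather than set by hand.

```shell
# Illustrative only: roughly what Claudex exports inside the sandbox when a
# bridge provider is selected (variable names from this README; values are
# placeholders).
export ANTHROPIC_BASE_URL=http://127.0.0.1:3456   # local anthropic-bridge
export OPENROUTER_API_KEY=sk-or-...               # for openrouter/* models
# or: export GITHUB_COPILOT_TOKEN=...             # for copilot/* models
```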
For Anthropic providers, Claudex uses your Claude auth token directly. For custom providers, Claudex calls your configured Anthropic-compatible base_url.
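The provider dispatch can be pictured as a prefix match on the model ID. This is a sketch under stated assumptions: `route_for_model` is a hypothetical name, and the real logic lives in Claudex's backend, not in a shell function.

```shell
#!/bin/sh
# Sketch: pick a route from the model-ID prefix (function name is
# hypothetical; the actual dispatch is internal to Claudex).
route_for_model() {
  case "$1" in
    openai/*|openrouter/*|copilot/*) echo "anthropic-bridge" ;;
    *)                               echo "anthropic-direct" ;;
  esac
}

route_for_model "openrouter/moonshotai/kimi-k2.5"   # prints: anthropic-bridge
route_for_model "claude-sonnet-4-5"                 # prints: anthropic-direct
```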
`claude-agent-sdk` addresses bridge providers through prefixed model IDs (`openai/*`, `openrouter/*`, `copilot/*`).

Workspaces are the top-level organizational unit. Each workspace owns a dedicated sandbox and groups all related chats under one project context.
Each workspace gets its own sandbox instance (Docker container or host process). Chats within a workspace share the same filesystem, installed tools, and .claude configuration. Switching between workspaces switches the entire execution environment.
When creating a workspace you can override the default sandbox provider (Docker or Host). The provider is locked at creation time — all chats in that workspace use the same provider.
```shell
git clone https://github.com/Mng-dev-ai/claudex.git
cd claudex
docker compose -p claudex-web -f docker-compose.yml up -d
```

Open http://localhost:3000.

To stop the stack or follow its logs:

```shell
docker compose -p claudex-web -f docker-compose.yml down
docker compose -p claudex-web -f docker-compose.yml logs -f
```
Desktop mode uses Tauri with a bundled Python backend sidecar on localhost:8081, with local SQLite storage.
When running in desktop mode, the backend sidecar listens on port `8081`:

```
Tauri Desktop App
-> React frontend (.env.desktop)
-> bundled backend sidecar (localhost:8081)
-> local SQLite database
```
Requirements:
Dev workflow:

```shell
cd frontend
npm install
npm run desktop:dev
```
Build (unsigned dev):

```shell
cd frontend
npm run desktop:build
```
App bundle output: `frontend/src-tauri/target/release/bundle/macos/Claudex.app`

Desktop troubleshooting:

- Free port `8081` if it is already in use.

Configure providers in Settings -> Providers.
- `anthropic`: paste a token from `claude setup-token`
- `openai`: authenticate with the OpenAI device flow in the UI
- `copilot`: authenticate with the GitHub device flow in the UI
- `openrouter`: add an OpenRouter API key and model IDs
- `custom`: set an Anthropic-compatible `base_url`, token, and model IDs

Example model IDs: `gpt-5.2-codex`, `gpt-5.2`, `gpt-5.3-codex` (OpenAI); `moonshotai/kimi-k2.5`, `minimax/minimax-m2.1`, `google/gemini-3-pro-preview` (OpenRouter); `GLM-5`, `M2.5`, or private org-specific endpoints for `custom` (depends on your backend compatibility).

Switching providers within a workspace does not require a new workflow:

- `.claude` resources (skills, agents, commands) carry over

This is the main value of using Claude Code as the harness while changing inference providers behind Anthropic Bridge.
Default service ports: `3000` (web UI), `8080`, `5432` (PostgreSQL), `6379` (Redis), `5900`, `6080`, `8765`.

Health endpoints: `GET /health` and `GET /api/v1/readyz`.

The frontend is served at `/` and the API under `/api/*`.
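A minimal readiness poll against the health endpoints above might look like this. Only the endpoint paths come from this README; the `wait_for` helper and the backend port used in the example call are assumptions.

```shell
#!/bin/sh
# Poll a health URL until it answers successfully or the attempts run out.
wait_for() {
  url="$1"; tries="${2:-30}"
  i=0
  while [ "$i" -lt "$tries" ]; do
    curl -fsS "$url" >/dev/null 2>&1 && return 0
    i=$((i + 1)); sleep 1
  done
  return 1
}

# e.g. wait_for "http://localhost:8080/api/v1/readyz" 60   # port is an assumption
```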

Apache 2.0. See LICENSE.
Contributions are welcome. Please open an issue first to discuss what you would like to change, then submit a pull request.