by Corbell-AI

AI-powered spec generation and review using multi-repo code graph intelligence for backend teams that ship to production.

# Add to your Claude Code skills

```bash
git clone https://github.com/Corbell-AI/Corbell
```

You're a staff engineer or architect at a company where features touch 5–10 repositories. Every quarter your team re-litigates the same architectural decisions: "Should we use Kafka or SQS?", "Why do we have three different auth patterns?", "Who owns the rate-limiting layer?"
The decisions live in Confluence pages nobody reads, Slack messages nobody can find, and the memories of engineers who've since left.
When a new engineer joins—or even when you return to a service you haven't touched in 6 months—you're starting from scratch.
Corbell gives your team a living knowledge graph of your architecture — built from the actual code in your repos and your team's past design docs. When you need a new spec, Corbell generates one that respects your established patterns instead of inventing new ones. When you push to Linear, each task carries the exact method signatures, call paths, and cross-service impacts an AI coding agent needs to work autonomously.
```
Your repos → [graph:build] → Service graph (SQLite)
Your docs  → [docs:scan]   → Design pattern extraction
        ↓
[spec new --feature "Payment Retry" --prd-file prd.md]
  → Generates 3-4 PRD-driven code search queries (LLM or regex)
  → Auto-discovers relevant services via embedding similarity
  → Injects graph topology + real code snippets
  → Applies your team's established design patterns
  → Calls Claude / GPT-4o to write the full design doc
  → Displays token usage and estimated cost
  → specs/payment-retry.md ✓
        ↓
[spec review]    → Checks claims against graph → .review.md
[spec decompose] → Parallel task tracks YAML
[export linear]  → Linear issues with full method/service context
```
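The "auto-discovers relevant services via embedding similarity" step above can be sketched as a cosine-similarity ranking over per-service embeddings. Everything here (the toy vectors, the `discover_services` helper, the threshold) is illustrative, not Corbell's actual implementation:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def discover_services(feature_vec, service_vecs, top_k=3, threshold=0.2):
    """Rank services by similarity to the feature-description embedding."""
    scored = sorted(
        ((cosine(feature_vec, vec), sid) for sid, vec in service_vecs.items()),
        reverse=True,
    )
    return [sid for score, sid in scored[:top_k] if score >= threshold]

# Toy embeddings; in Corbell these would come from the code chunk index.
vecs = {
    "payments-service": [0.9, 0.1, 0.0],
    "auth-service":     [0.1, 0.9, 0.0],
    "email-service":    [0.0, 0.1, 0.9],
}
print(discover_services([0.8, 0.2, 0.1], vecs, top_k=2))
# → ['payments-service', 'auth-service']
```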
No servers. No cloud setup. Runs entirely from your laptop against local repos.
```bash
pip install corbell

# With LLM support (pick one):
pip install "corbell[anthropic]"   # Claude (recommended)
pip install "corbell[openai]"      # GPT-4o

# With exports:
pip install "corbell[notion,linear]"

# Everything:
pip install "corbell[anthropic,openai,notion,linear]"
```
Requirements: Python ≥ 3.11
**1. Initialize a Corbell workspace**

```bash
corbell init
```

✅ Creates `workspace.yaml` in your project root

**2. Generate your first design document**

```bash
corbell spec new --prd "Add user authentication feature"
```

✅ Creates a design document with auto-discovered services

**3. View the architecture graph (optional)**

```bash
corbell ui serve
```

✅ Opens a browser at http://localhost:7433
`workspace.yaml` exists in your project.

Need more details? See Full Documentation below.
```bash
cd ~/my-platform   # wherever your repos live
corbell init
```

Edit `corbell-data/workspace.yaml`:

```yaml
workspace:
  name: my-platform
  services:
    - id: payments-service
      repo: ../payments-service
      language: python   # python | javascript | typescript | go | java
    - id: auth-service
      repo: ../auth-service
      language: go

llm:
  provider: anthropic    # or: openai, ollama
  model: claude-sonnet-4-5-20250929
  api_key: ${ANTHROPIC_API_KEY}
```
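The `${ANTHROPIC_API_KEY}` placeholder suggests environment-variable substitution at config-load time. A minimal sketch of that expansion rule (this is an assumption about Corbell's behavior, not documented here):

```python
import os
import re

# Matches ${VAR_NAME} placeholders; the exact syntax Corbell accepts is assumed.
_ENV_PATTERN = re.compile(r"\$\{([A-Z0-9_]+)\}")

def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment."""
    return _ENV_PATTERN.sub(lambda m: os.environ.get(m.group(1), ""), value)

os.environ["ANTHROPIC_API_KEY"] = "sk-ant-example"
print(expand_env("${ANTHROPIC_API_KEY}"))  # → sk-ant-example
```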
```bash
corbell graph build --methods             # service + dependency graph + call graph, typed signatures, flows
corbell embeddings build                  # code chunk index for semantic search
corbell docs scan && corbell docs learn   # extract patterns from existing RFCs/ADRs
```
```bash
export ANTHROPIC_API_KEY="sk-ant-..."

# From a PRD file — services are auto-discovered, no --service flag needed
corbell spec new \
  --feature "Payment Retry with Exponential Backoff" \
  --prd-file docs/payment-retry-prd.md

# Spec with full call graph and infrastructure context
corbell spec new --feature "Auth Flow" --prd-file prd.md --full-graph

# Inline PRD
corbell spec new --feature "Rate Limiting" --prd "Tier 1: 100 req/min..."

# Document your existing codebase with no PRD at all
corbell spec new --existing

# Add existing design docs as context (ADRs, Confluence exports, RFCs)
corbell spec new --feature "Auth Token Refresh" --prd-file prd.md \
  --design-doc docs/auth-design-2023.md
```
Token usage and estimated cost are shown after every LLM call. Template mode (no LLM key) generates a structured skeleton with graph context filled in.
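As a rough idea of how a per-call cost estimate can be derived from token counts: multiply input and output tokens by per-million-token prices. The model name and prices below are placeholder assumptions, not real provider pricing:

```python
# (input, output) USD per million tokens; illustrative values only.
PRICE_PER_MTOK = {
    "example-model": (3.00, 15.00),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one LLM call from its token usage."""
    inp, out = PRICE_PER_MTOK[model]
    return (input_tokens * inp + output_tokens * out) / 1_000_000

cost = estimate_cost("example-model", 12_000, 3_000)
print(f"tokens: {12_000 + 3_000:,} / est. cost: ${cost:.4f}")
```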
Add a constraints block to any spec and all future specs will respect it:
```markdown
<!-- CORBELL_CONSTRAINTS_START -->
- **Cloud provider**: Only Azure — no AWS services permitted
- **Latency SLO**: p99 < 200ms for all synchronous API calls
- **Security**: All PII encrypted at rest (AES-256) and in transit (TLS 1.2+)
<!-- CORBELL_CONSTRAINTS_END -->
```
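The start/end marker comments make constraints easy to pull out of a spec file mechanically. A sketch of how a review step might extract them (the function name and exact parsing rules are hypothetical):

```python
import re

# Captures everything between the CORBELL constraint marker comments.
_BLOCK = re.compile(
    r"<!--\s*CORBELL_CONSTRAINTS_START\s*-->(.*?)<!--\s*CORBELL_CONSTRAINTS_END\s*-->",
    re.DOTALL,
)

def extract_constraints(spec_markdown: str) -> list[str]:
    """Collect the bullet lines from every constraints block in a spec."""
    lines = []
    for block in _BLOCK.findall(spec_markdown):
        lines += [ln.strip() for ln in block.splitlines() if ln.strip().startswith("- ")]
    return lines

spec = """\
<!-- CORBELL_CONSTRAINTS_START -->
- **Cloud provider**: Only Azure
- **Latency SLO**: p99 < 200ms
<!-- CORBELL_CONSTRAINTS_END -->
"""
print(extract_constraints(spec))
# → ['- **Cloud provider**: Only Azure', '- **Latency SLO**: p99 < 200ms']
```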
`corbell spec review` checks proposed designs against these constraints. The `corbell ui serve` graph browser also surfaces them in a persistent bar at the bottom.
```bash
corbell spec review specs/payment-retry.md      # → .review.md sidecar
corbell spec approve specs/payment-retry.md
corbell spec decompose specs/payment-retry.md   # → .tasks.yaml

export CORBELL_LINEAR_API_KEY="lin_api_..."
corbell export linear specs/payment-retry.tasks.yaml
```
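Under the hood, a Linear export presumably talks to Linear's GraphQL API (its `issueCreate` mutation). A hedged sketch of what that call could look like; the team ID is a placeholder, and the payload shape beyond Linear's documented mutation is an assumption:

```python
import json
import urllib.request

LINEAR_URL = "https://api.linear.app/graphql"

# issueCreate mutation per Linear's GraphQL API.
MUTATION = """
mutation IssueCreate($input: IssueCreateInput!) {
  issueCreate(input: $input) { success issue { identifier url } }
}
"""

def build_payload(team_id: str, title: str, description: str) -> dict:
    """Assemble the GraphQL request body for one issue."""
    return {
        "query": MUTATION,
        "variables": {"input": {"teamId": team_id, "title": title,
                                "description": description}},
    }

def create_issue(api_key: str, payload: dict) -> dict:
    """POST the mutation; Linear expects the API key in the Authorization header."""
    req = urllib.request.Request(
        LINEAR_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "Authorization": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

payload = build_payload("TEAM_ID", "Payment retry: add backoff policy",
                        "Method signatures and call paths go here.")
print(payload["variables"]["input"]["title"])
```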
```bash
corbell ui serve                    # opens http://localhost:7433 · Ctrl+C to stop
corbell ui serve --port 8080 --no-browser
```
An interactive local graph view — no cloud, no sign-in, reads from your existing SQLite store:
- Flows (e.g. `LoginFlow`) and git change coupling pairs with strength %
- `CORBELL_CONSTRAINTS_START` blocks from your spec files, shown as persistent amber pills at the bottom. Click to expand.

Command reference:

- `graph`: service dependency graph
  - `build`: `--methods` for call graph + typed signatures + git coupling + flows
  - `services`: list discovered services
  - `deps`: show service dependencies
  - `callpath`: find call paths between methods
- `embeddings`: code embedding index
  - `build`: index code chunks
  - `query`: semantic search
- `docs`: design doc patterns
  - `scan` / `learn` / `patterns`
- `spec`: design spec lifecycle
  - `new`: `--feature --prd-file --prd --design-doc --existing --no-llm`
  - `lint`: validate structure (`--ci` exits 1)
  - `review`: check spec vs graph → `.review.md`
  - `approve` / `decompose` / `context`
- `export`: `notion` | `linear`
- `ui`: architecture graph browser
  - `serve`: `--port` (default 7433), `--no-browser`
- `mcp`: Model Context Protocol server
  - `serve`: stdio transport for Claude Desktop / Cursor
- `init`: create `workspace.yaml`
Corbell exposes its architecture graph, code embeddings, and spec tools via MCP, so external AI platforms (Cursor, Claude Desktop, Antigravity) can query your codebase context directly.
| Tool | Description |
|---|---|
| `graph_query` | Query service dependencies, methods, and call paths |
| `get_architecture_context` | Auto-discover relevant services for a feature description |
| `code_search` | Semantic search across the code embedding index |
| `list_services` | List all services in the workspace graph |
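MCP clients speak JSON-RPC 2.0, so a `tools/call` request to any of the tools above has a predictable shape. A sketch of building one such message (the query argument is illustrative):

```python
import json

def tools_call(request_id: int, tool: str, arguments: dict) -> str:
    """Build an MCP tools/call request (JSON-RPC 2.0, sent over stdio)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

msg = tools_call(1, "code_search", {"query": "payment retry backoff"})
print(msg)
```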
```bash
# Default: stdio transport (for IDE integrations)
corbell mcp serve

# SSE transport (for web-based MCP clients / MCP Inspector)
corbell mcp serve --transport sse --port 8000
```
Cursor (`~/.cursor/mcp.json`):

```json
{
  "mcpServers": {
    "corbell": {
      "command": "corbell",
      "args": ["mcp", "serve"]
    }
  }
}
```
Claude Desktop (`~/Library/Application Support/Claude/claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "corbell": {
      "command": "corbell",
      "args": ["mcp", "serve"]
    }
  }
}
```
If your IDE overrides the working directory, set the `CORBELL_WORKSPACE` environment variable:

```bash
env CORBELL_WORKSPACE=/path/to/my-platform corbell mcp serve
```
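A sketch of the precedence the override implies: `CORBELL_WORKSPACE` if set, otherwise the process working directory. This resolution order is an assumption, not Corbell's documented behavior:

```python
import os
from pathlib import Path

def resolve_workspace() -> Path:
    """Assumed precedence: CORBELL_WORKSPACE wins, else the current directory."""
    return Path(os.environ.get("CORBELL_WORKSPACE") or os.getcwd())

os.environ["CORBELL_WORKSPACE"] = "/path/to/my-platform"
print(resolve_workspace())  # → /path/to/my-platform
```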