# CTX

CTX, by context-hub, solves the context management gap when working with LLMs like ChatGPT or Claude. It helps developers organize and automatically collect information from their codebase into structured documents that can be easily shared with AI assistants — an MCP-powered development toolkit that gives AI full access to your codebase.

Add to your Claude Code skills:

```shell
git clone https://github.com/context-hub/generator
```
CTX is a single ~20 MB binary with zero dependencies. No Node.js, no Python, no runtime — just download, connect to your MCP client, and start coding with AI.
Connect it to Claude Desktop, Cursor, Cline, or any MCP-compatible client — and your AI gets direct access to read, write, search, and modify files across your projects.
CTX is designed with Claude Desktop in mind and works best with it. Claude's deep understanding of code combined with CTX's filesystem tools, custom commands, and multi-project support creates a seamless development experience — like having a senior developer who knows your entire codebase sitting right next to you.
CTX provides a built-in MCP server with powerful filesystem tools for reading, writing, searching, and modifying files.
Define project-specific commands that AI can execute through MCP:
```yaml
tools:
  - id: run-tests
    description: "Run project tests with coverage"
    type: run
    commands:
      - cmd: vendor/bin/phpunit
        args: [ "--coverage-html", "logs/coverage" ]

  - id: deploy
    description: "Deploy to staging"
    type: run
    schema:
      type: object
      properties:
        branch:
          type: string
          default: "main"
    commands:
      - cmd: ./deploy.sh
        args: [ "{{branch}}" ]
```
Tests, migrations, linters, deployments — anything your terminal can run, AI can trigger.
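The `{{branch}}` placeholder in the `deploy` tool above suggests simple parameter templating: values supplied by the AI (or schema defaults) are substituted into the command arguments. A minimal sketch of how that substitution could work — the `render_args` helper is hypothetical, not part of CTX:

```python
import re

def render_args(args, params):
    """Fill {{name}} placeholders in tool args with supplied parameters.

    Hypothetical helper: CTX's real templating engine may behave differently,
    e.g. around escaping or missing-parameter handling.
    """
    fill = lambda m: str(params.get(m.group(1), ""))
    return [re.sub(r"\{\{(\w+)\}\}", fill, arg) for arg in args]

# For the `deploy` tool, the AI supplies {"branch": "release"}:
print(render_args(["{{branch}}"], {"branch": "release"}))  # ['release']
```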
Work across multiple microservices simultaneously. AI sees all your projects and can develop cross-cutting features:
```yaml
projects:
  - name: backend-api
    path: ../backend
    description: "REST API service"
  - name: auth-service
    path: ../auth
    description: "Authentication microservice"
  - name: shared-lib
    path: ../packages/shared
    description: "Shared domain models"
```
Start a session, ask AI to list available projects, and develop features that span multiple services — all in one conversation.
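The relative `path` entries above are presumably resolved against the directory containing the config file. A small sketch of that resolution — illustrative only, as CTX's actual resolution rules may differ:

```python
import posixpath

def resolve_project(config_dir, project_path):
    # Join the project's relative path onto the config directory and
    # normalize away the ".." segments. Illustrative assumption about
    # how CTX maps config entries to absolute project roots.
    return posixpath.normpath(posixpath.join(config_dir, project_path))

print(resolve_project("/work/ctx", "../backend"))          # /work/backend
print(resolve_project("/work/ctx", "../packages/shared"))  # /work/packages/shared
```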
Define exactly what context your AI needs. CTX collects code from files, git diffs, GitHub repos, URLs, and more — then structures it into clean markdown documents:
```yaml
documents:
  - description: User Authentication System
    outputPath: auth.md
    sources:
      - type: file
        sourcePaths: [ src/Auth ]
        filePattern: "*.php"
      - type: git_diff
        commit: "last-week"
```
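Conceptually, each document walks its sources and concatenates the matching content into one markdown file. A toy sketch of the `file` source case, under the assumption that files are gathered by glob pattern and fenced per file — CTX's real output includes more structure than this:

```python
from pathlib import Path

def build_document(description, source_dir, pattern="*.php"):
    """Gather files matching `pattern` under `source_dir` into one
    markdown document. A toy approximation of a CTX `file` source."""
    parts = [f"# {description}"]
    for path in sorted(Path(source_dir).rglob(pattern)):
        parts.append(f"## {path.name}\n\n```php\n{path.read_text()}\n```")
    return "\n\n".join(parts)
```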
Everything is configured through `context.yaml` with full JSON Schema support. Your AI assistant can generate and modify these configs for you — just describe what you need.
```shell
ctx init      # Generate initial config
ctx generate  # Build context documents
ctx server    # Start MCP server
```
Linux / WSL:

```shell
curl -sSL https://raw.githubusercontent.com/context-hub/generator/main/download-latest.sh | sh
```

Windows:

```powershell
powershell -c "& ([ScriptBlock]::Create((irm 'https://raw.githubusercontent.com/context-hub/generator/main/download-latest.ps1'))) -AddToPath"
```
The fastest way — auto-detect OS and configure your MCP client:
```shell
ctx mcp:config
```
Or add manually to your MCP client config:
```json
{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": ["server"]
    }
  }
}
```
For a specific project:
```json
{
  "mcpServers": {
    "ctx": {
      "command": "ctx",
      "args": ["server", "-c", "/path/to/project"]
    }
  }
}
```
That's it. Your AI assistant now has full access to your project through MCP.
If you also want to generate static context files for copy-paste workflows:
```shell
cd your-project
ctx init      # Create context.yaml
ctx generate  # Build markdown contexts
```
CTX operates in two modes that complement each other:
MCP Server Mode — AI interacts with your codebase in real-time:
```
AI Assistant  ←→  CTX MCP Server  ←→  Your Project Files
                         ↕
          Custom Tools (tests, deploy, lint...)
          Multiple Projects
          Context Documents
```
Context Generation Mode — build structured documents for any LLM:
```
context.yaml → Sources → Filters → Modifiers → Markdown Documents
```
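The pipeline above can be sketched in a few lines, assuming each stage is a plain function — the names and shapes here are illustrative, not CTX's actual API:

```python
def run_pipeline(sources, filters, modifiers):
    # Sources yield items, filters drop items, modifiers rewrite them;
    # the survivors are joined into one markdown string.
    items = [item for source in sources for item in source()]
    for keep in filters:
        items = [i for i in items if keep(i)]
    for modify in modifiers:
        items = [modify(i) for i in items]
    return "\n\n".join(items)

# One in-memory source, a filter that drops tests, a heading "modifier":
source = lambda: ["src/Auth.php", "tests/AuthTest.php"]
no_tests = lambda p: not p.startswith("tests/")
as_heading = lambda p: f"## {p}"
print(run_pipeline([source], [no_tests], [as_heading]))  # ## src/Auth.php
```

The staged design means each source, filter, or modifier can be configured independently in `context.yaml` without the others knowing about it.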
Connect CTX to Claude Desktop or Cursor. Ask your AI to explore the codebase, understand architecture, write new features, run tests, and fix issues — all without leaving the conversation.
Working on a feature that touches multiple microservices? Register all projects, and AI can read code from one service, understand shared models, and implement changes across the entire stack.
Generate context documents with recent git diffs, relevant source files, and architecture overview. Share with reviewers or AI assistants for thorough, informed reviews.
New team member? Generate a comprehensive project overview — architecture, key interfaces, domain models — in seconds. AI can then answer questions about the codebase with full context.