Free, open-source implementation of Karpathy's LLM Wiki (spec). Upload documents, connect your Claude account via MCP, and have it write your wiki! Hosted version available at llmwiki.app.

# Add to your Claude Code skills

```bash
git clone https://github.com/lucasastorian/llmwiki
```

| Layer | Description |
|-------|-------------|
| Raw Sources | PDFs, articles, notes, transcripts. Your immutable source of truth. The LLM reads them but never modifies them. |
| The Wiki | LLM-generated markdown pages — summaries, entity pages, cross-references, mermaid diagrams, tables. The LLM owns this layer. You read it; the LLM writes it. |
| The Tools | Search, read, and write. Claude connects via MCP and orchestrates the rest. |
LLM Wiki ships an MCP server that Claude.ai connects to directly. Once connected, Claude has tools to search, read, write, and delete across your entire knowledge vault. All operations below happen through Claude — you talk to it, it maintains the wiki.
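The server's internals aren't reproduced here, but to make the tool layer concrete, here is a minimal sketch of how a search-style tool could be exposed with the MCP Python SDK's FastMCP; the tool name, the in-memory page store, and the substring matching are illustrative stand-ins for the real implementation, which searches Postgres via PGroonga and sits behind Supabase OAuth:

```python
# Illustrative sketch only: exposing a search-style tool with the MCP Python SDK.
# The tool name, in-memory store, and matching logic are stand-ins for the real server.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("llm-wiki")

PAGES = {"wiki/transformers.md": "Self-attention lets every token attend to every other token."}

@mcp.tool()
def search(query: str) -> list[str]:
    """Return paths of wiki pages whose text mentions the query."""
    return [path for path, text in PAGES.items() if query.lower() in text.lower()]

if __name__ == "__main__":
    # Serve over HTTP so Claude can connect to the /mcp endpoint.
    mcp.run(transport="streamable-http")
```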
Ingest — Drop a source in. Claude reads it, writes a summary, updates entity and concept pages across the wiki, and flags anything that contradicts existing knowledge. A single source might touch 10-15 wiki pages.
Query — Ask complex questions against the compiled wiki. Knowledge is already synthesized — not re-derived from raw chunks each time. Good answers get filed back as new pages, so your explorations compound.
Lint — Run health checks. Find inconsistent data, stale claims, orphan pages, missing cross-references. Claude suggests new questions to investigate and new sources to look for.
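Linting is something you ask Claude to do through the tools, but to make one of the checks concrete, here is roughly what an orphan-page scan amounts to; the local wiki/ folder and standard markdown link syntax are hypothetical, not how the server stores pages:

```python
# Hypothetical orphan-page check: list wiki pages that no other page links to.
# Assumes a local folder of markdown pages using standard [text](page.md) links.
import re
from pathlib import Path

wiki = Path("wiki")  # hypothetical local export of the wiki layer
pages = list(wiki.rglob("*.md"))

linked_names = {
    Path(target).name
    for page in pages
    for target in re.findall(r"\]\(([^)]+\.md)\)", page.read_text(encoding="utf-8"))
}

for page in pages:
    if page.name not in linked_names:
        print("orphan:", page.relative_to(wiki))
```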
```
┌─────────────┐     ┌─────────────┐     ┌─────────────┐
│   Next.js   │────▶│   FastAPI   │────▶│  Supabase   │
│  Frontend   │     │   Backend   │     │ (Postgres)  │
└─────────────┘     └──────┬──────┘     └─────────────┘
                           │
                    ┌──────┴──────┐
                    │ MCP Server  │◀──── Claude
                    └─────────────┘
```
| Component | Stack | Responsibilities |
|-----------|-------|------------------|
| Web (web/) | Next.js 16, React 19, Tailwind, Radix UI | Dashboard, PDF/HTML viewer, wiki renderer, onboarding |
| API (api/) | FastAPI, asyncpg, aioboto3 | Auth, uploads (TUS), document processing, OCR (Mistral) |
| Converter (converter/) | FastAPI, LibreOffice | Isolated office-to-PDF conversion (non-root, zero AWS creds) |
| MCP (mcp/) | MCP SDK, Supabase OAuth | Tools for Claude: guide, search, read, write, delete |
| Database | Supabase (Postgres + RLS + PGroonga) | Documents, chunks, knowledge bases, users |
| Storage | S3-compatible | Raw uploads, tagged HTML, extracted images |
Once connected, Claude has full access to your knowledge vault:
| Tool | Description |
|------|-------------|
| guide | Explains how the wiki works and lists available knowledge bases |
| search | Browse files (list) or keyword search with PGroonga ranking (search) |
| read | Read documents — PDFs with page ranges, inline images, glob batch reads |
| write | Create wiki pages, edit with str_replace, append. SVG and CSV asset support |
| delete | Archive documents by path or glob pattern |
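The str_replace edit mode mentioned in the table follows the usual contract of replacing exactly one occurrence of a literal string; a minimal sketch of that behavior, independent of the project's actual code:

```python
def str_replace(page_text: str, old: str, new: str) -> str:
    """Replace exactly one occurrence of `old`; refuse missing or ambiguous matches.

    Sketch of the usual str_replace contract, not the project's actual code.
    """
    count = page_text.count(old)
    if count == 0:
        raise ValueError("old string not found in page")
    if count > 1:
        raise ValueError(f"old string matches {count} times; provide more context")
    return page_text.replace(old, new, 1)

# Example: correct a stale claim on an entity page.
page = "# GPT-4\nContext window: 8k tokens.\n"
page = str_replace(page, "Context window: 8k tokens.", "Context window: 128k tokens.")
```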
The fastest way to try LLM Wiki is the hosted version: sign in at llmwiki.app, upload your documents, and connect your Claude account via MCP. That's it. No local setup required.
To self-host, start with the database:

```bash
psql $DATABASE_URL -f supabase/migrations/001_initial.sql
```

Or use local Docker: `docker compose up -d`
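To confirm the migration applied, a short asyncpg script (the same driver the API uses) can list whatever tables now exist in the public schema; nothing project-specific is assumed here:

```python
# Quick check that the migration ran: list tables in the public schema.
import asyncio
import os

import asyncpg  # same driver the API uses


async def main() -> None:
    conn = await asyncpg.connect(os.environ["DATABASE_URL"])
    try:
        rows = await conn.fetch(
            "SELECT tablename FROM pg_tables WHERE schemaname = 'public' ORDER BY tablename"
        )
        print([r["tablename"] for r in rows])
    finally:
        await conn.close()


asyncio.run(main())
```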
```bash
cd api
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
cp ../.env.example .env  # edit with your credentials
uvicorn main:app --reload --port 8000
```
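With the API running on port 8000, a quick way to confirm it is serving is to fetch FastAPI's auto-generated OpenAPI document (this assumes the default /openapi.json route has not been disabled):

```python
# Assumes FastAPI's default /openapi.json route is enabled.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8000/openapi.json") as resp:
    spec = json.load(resp)

print(spec["info"]["title"], "-", len(spec.get("paths", {})), "routes")
```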
```bash
cd mcp
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
uvicorn server:app --reload --port 8080
```
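Before pointing Claude at the local endpoint, you can check that the MCP server is answering by listing its tools with the MCP Python SDK's streamable-HTTP client; this sketch skips authentication, so a deployment enforcing the Supabase OAuth flow may reject it:

```python
# Sketch using the MCP Python SDK's streamable-HTTP client. Authentication is
# omitted, so an OAuth-protected deployment may refuse this request.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client


async def main() -> None:
    async with streamablehttp_client("http://localhost:8080/mcp") as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])


asyncio.run(main())
```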
```bash
cd web
npm install
cp .env.example .env.local
npm run dev
```
Connect Claude to the local MCP server at http://localhost:8080/mcp.

API (api/.env):

```bash
DATABASE_URL=postgresql://...
SUPABASE_URL=https://your-ref.supabase.co
SUPABASE_JWT_SECRET=        # optional, for legacy HS256 projects
MISTRAL_API_KEY=            # for PDF OCR
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=us-east-1
S3_BUCKET=your-bucket
APP_URL=http://localhost:3000
CONVERTER_URL=              # optional, URL of the isolated converter service
```
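A small pre-flight script can catch missing values before starting uvicorn; the required-variable list below is just a reading of the example above (python-dotenv is assumed to be installed), not an authoritative list:

```python
# Pre-flight check for api/.env; the variable list mirrors the example above.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv(".env")

required = [
    "DATABASE_URL", "SUPABASE_URL", "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY", "AWS_REGION", "S3_BUCKET", "APP_URL",
]
missing = [name for name in required if not os.getenv(name)]
if missing:
    raise SystemExit("Missing required settings: " + ", ".join(missing))
print("api/.env looks complete")
```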
Web (web/.env.local):

```bash
NEXT_PUBLIC_SUPABASE_URL=https://your-ref.supabase.co
NEXT_PUBLIC_SUPABASE_ANON_KEY=your-anon-key
NEXT_PUBLIC_API_URL=http://localhost:8000
NEXT_PUBLIC_MCP_URL=http://localhost:8080/mcp
```
The tedious part of maintaining a knowledge base is not the reading or the thinking — it's the bookkeeping. Updating cross-references, keeping summaries current, noting when new data contradicts old claims, maintaining consistency across dozens of pages.
Humans abandon personal wikis because the maintenance burden grows faster than the value. LLMs don't get bored, don't forget to update a cross-reference, and can touch 15 files in one pass. The wiki stays maintained because the cost of maintenance drops to near zero.
The human's job is to curate sources, direct the analysis, ask good questions, and think about what it all means. The LLM's job is everything else.
Apache 2.0