by evolsb
AI-powered contract review skill with CUAD risk detection, market benchmarks, and lawyer-ready redlines. Works with Claude Code, Codex, Cursor, and 26+ tools.

I was reviewing real contracts — NDAs, SaaS agreements, M&A docs, merchant agreements — and wanted AI assistance directly in my coding workflow. So I researched what was available.
Nothing worked as a drop-in skill. So I built one grounded in the CUAD dataset (41 legal risk categories from 510 real contracts), tested it against actual agreements, and iterated until the output was useful for real negotiations.
The result: position-aware review with market benchmarks, document-type checklists, and actual redline language — not just a list of issues.
The skill analyzes legal contracts and outputs position-aware findings, market benchmarks, and redline language.
This skill outputs structured JSON redlines. To produce the tracked-changes Word docs and redline PDFs that lawyers actually send, pair with legal-redline-tools:
```bash
pip install git+https://github.com/evolsb/legal-redline-tools.git
```

```bash
# After the skill generates redlines.json:
legal-redline apply contract.docx redlined.docx \
  --from-json redlines.json \
  --pdf redline.pdf \
  --memo-pdf internal-memo.pdf
```
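For a sense of what the tooling consumes, here is a rough sketch of a `redlines.json` entry. The field names below are illustrative assumptions, not the skill's actual schema — see the skill's own output for the real format:

```python
import json

# Hypothetical redlines.json entry. Field names are illustrative only;
# the real schema is defined by the skill's output.
redlines = {
    "document": "contract.docx",
    "party_position": "receiving party",
    "redlines": [
        {
            "clause": "Limitation of Liability",
            "risk": "red",  # e.g. liability cap below the market band
            "current_text": "Liability is capped at three (3) months of fees.",
            "proposed_text": "Liability is capped at twelve (12) months of fees.",
            "rationale": "Market standard is a 12-month cap.",
        }
    ],
}

print(json.dumps(redlines, indent=2))
```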
Tell it which party you are (customer, vendor, buyer, seller, receiving party) — the skill adjusts what it flags as risky.
Specialized checklists for each contract type.
Compares terms to industry norms with clear thresholds:
| Provision | Standard | Yellow | Red |
|-----------|----------|--------|-----|
| Liability cap | 12 months | 6-11 mo | <6 mo |
| Auto-renewal notice | 90+ days | 60-89 days | <60 days |
| Non-compete | 1-2 years | 3-4 years | 5+ years |
| Rep survival (M&A) | 12-18 mo | 24-30 mo | 36+ mo |
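The benchmark table reads as a simple threshold classifier. A minimal sketch of two of the rows (the function names are mine, not part of the skill):

```python
def classify_liability_cap(months: float) -> str:
    """Map a liability cap (in months of fees) to the benchmark bands above."""
    if months >= 12:
        return "standard"
    if months >= 6:
        return "yellow"  # 6-11 months: below market, worth negotiating
    return "red"         # under 6 months: flag for counsel

def classify_auto_renewal_notice(days: int) -> str:
    """Map an auto-renewal notice window (in days) to the benchmark bands above."""
    if days >= 90:
        return "standard"
    if days >= 60:
        return "yellow"
    return "red"

print(classify_liability_cap(9), classify_auto_renewal_notice(45))  # yellow red
```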
Tells you what's actually changeable.
Instant detection of danger signs.
Flags when governing law affects enforceability.
Special handling for acquisition agreements.
This skill follows the open Agent Skills standard and works with any compatible tool.
```bash
# Claude Code
git clone https://github.com/evolsb/claude-legal-skill ~/.claude/skills/contract-review

# OpenAI Codex
git clone https://github.com/evolsb/claude-legal-skill ~/.codex/skills/contract-review

# Cursor, Copilot, Gemini CLI, etc.
# Clone to your tool's skills directory
git clone https://github.com/evolsb/claude-legal-skill ~/Developer/claude-legal-skill
ln -s ~/Developer/claude-legal-skill ~/.claude/skills/contract-review
```
Review this NDA for red flags - I'm the receiving party
Analyze the indemnification in this MSA - I'm the vendor
What are the termination provisions? I'm the customer.
Review this acquisition agreement - I'm the seller
Check this merchant agreement - what's my chargeback exposure?
See examples/ for full sample outputs.
Based on ContractEval benchmarks, Claude achieves F1 ~0.62 on clause extraction. Best for first-pass review and issue flagging — not a replacement for attorney review on material deals.
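For context, F1 is the harmonic mean of precision and recall; an F1 of ~0.62 could correspond to, say, precision 0.65 and recall 0.59 (illustrative numbers, not figures from ContractEval):

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

print(round(f1(0.65, 0.59), 2))  # 0.62
```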
legal-redline-tools generates tracked-changes .docx files, redline PDFs, and negotiation memos from the skill's output.

Questions or feedback? Open an issue or email chris@ctsheehan.com.
MIT — see LICENSE