by fabio-rovai
AI-native ontology engine: a Rust MCP server with tools for building, validating, querying, and reasoning over RDF/OWL ontologies. In-memory Oxigraph triple store, native OWL2-DL tableaux reasoner, SHACL validation, SPARQL, versioning. Single binary, no JVM.
# Add to your Claude Code skills
git clone https://github.com/fabio-rovai/open-ontologies
name: open-ontologies
version: "0.5.1"
description: >
  AI-native ontology engineering using 50+ MCP tools backed by an in-memory
  Oxigraph triple store. Build, validate, query, and govern RDF/OWL ontologies
  with a generate-validate-iterate loop. Use when building ontologies, knowledge
  graphs, RDF data, SPARQL queries, BORO/4D modeling, SHACL validation, clinical
  terminology mapping, ingesting from CSV/JSON/Parquet/XLSX or SQL backbones
  (PostgreSQL, DuckDB), or Terraform-style ontology lifecycle management.
tags:
AI-native ontology engineering. Generate OWL/RDF directly, validate with MCP tools, iterate until clean, govern with a Terraform-style lifecycle.
This skill requires the Open Ontologies MCP server to provide the onto_* tools.
Install: cargo install open-ontologies or download from GitHub releases
MCP config (add to .mcp.json or Claude settings):
Open Ontologies is a Rust MCP server and desktop Studio for AI-native ontology engineering. It exposes 43 tools that let Claude build, validate, query, diff, lint, version, reason over, align, and persist RDF/OWL ontologies using an in-memory Oxigraph triple store — with Terraform-style lifecycle management, a marketplace of 32 standard ontologies, clinical crosswalks, semantic embeddings, and a full lineage audit trail.
The Studio wraps the engine in a visual desktop environment: virtualized ontology tree with hierarchy lines, breadcrumb navigation, and connection explorer; AI chat panel with /build (IES-level deep) and /sketch (quick prototype) commands; Protégé-style property inspector; and lineage viewer.
No JVM. No Protégé.
Pre-built binaries:
# macOS (Apple Silicon)
curl -LO https://github.com/fabio-rovai/open-ontologies/releases/latest/download/open-ontologies-aarch64-apple-darwin
chmod +x open-ontologies-aarch64-apple-darwin && mv open-ontologies-aarch64-apple-darwin /usr/local/bin/open-ontologies
# macOS (Intel)
curl -LO https://github.com/fabio-rovai/open-ontologies/releases/latest/download/open-ontologies-x86_64-apple-darwin
chmod +x open-ontologies-x86_64-apple-darwin && mv open-ontologies-x86_64-apple-darwin /usr/local/bin/open-ontologies
# Linux (x86_64)
curl -LO https://github.com/fabio-rovai/open-ontologies/releases/latest/download/open-ontologies-x86_64-unknown-linux-gnu
chmod +x open-ontologies-x86_64-unknown-linux-gnu && mv open-ontologies-x86_64-unknown-linux-gnu /usr/local/bin/open-ontologies
Docker:
docker pull ghcr.io/fabio-rovai/open-ontologies:latest
docker run -i ghcr.io/fabio-rovai/open-ontologies serve
{
"mcpServers": {
"open-ontologies": {
"command": "open-ontologies",
"args": ["serve"]
}
}
}
No credentials needed. All processing runs locally in an in-memory Oxigraph triple store. Network access is only used when you explicitly call onto_pull (fetch remote ontology) or onto_push (send to SPARQL endpoint) with a URL you provide. Monitor alerts (onto_monitor) are logged to stdout only.
When building or modifying ontologies, follow this workflow. Decide which tools to call and in what order based on results -- this is not a fixed pipeline.
1. onto_validate on the generated Turtle -- if it fails, fix syntax errors and re-validate
2. onto_load to load into the Oxigraph triple store
3. onto_stats to verify class count, property count, and triple count match expectations
4. onto_lint to check for missing labels, comments, domains, ranges -- fix any issues found
5. onto_query with SPARQL to verify structure (expected classes, subclass hierarchies, competency questions)
6. onto_diff to compare against a reference or previous version
7. onto_save to write the final ontology to a .ttl file
8. onto_version to save a named snapshot for rollback

For evolving ontologies in production:
- onto_plan shows added/removed classes, blast radius, risk score. Check onto_lock for protected IRIs.
- onto_enforce with a rule pack (generic, boro, value_partition) checks design pattern compliance.
- onto_apply with mode safe (clear + reload) or migrate (add owl:equivalentClass bridges).
- onto_monitor runs SPARQL watchers with threshold alerts. Use onto_monitor_clear if blocked.
- onto_drift compares versions with rename detection and self-calibrating confidence.

When applying an ontology to external data:
- onto_map -- generate mapping config from data schema + loaded ontology
- onto_ingest -- parse structured data (CSV, JSON, NDJSON, XML, YAML, XLSX, Parquet) into RDF
- onto_shacl -- validate against SHACL shapes (cardinality, datatypes, classes)
- onto_reason -- run RDFS or OWL-RL inference, materializing inferred triples
- onto_extend -- run the full pipeline (ingest, SHACL validate, reason) in one call

For healthcare ontologies:
- onto_crosswalk -- look up mappings between ICD-10, SNOMED CT, and MeSH
- onto_enrich -- add skos:exactMatch triples linking classes to clinical codes
- onto_validate_clinical -- check class labels against clinical crosswalk terminology

For aligning two ontologies:
- onto_align -- detect alignment candidates (equivalentClass, exactMatch, subClassOf) using 6 weighted signals
- onto_align_feedback -- accept/reject candidates to self-calibrate confidence weights

| Tool | When to use |
| ---- | ----------- |
| onto_validate | After generating or modifying Turtle -- always validate first |
| onto_load | After validation passes -- loads into triple store |
| onto_stats | After loading -- sanity check on counts |
| onto_lint | After loading -- catches missing labels, domains, ranges |
| onto_query | Verify structure, answer competency questions |
| onto_diff | Compare against a reference or previous version |
| onto_save | Persist ontology to a file |
| onto_convert | Convert between formats (Turtle, N-Triples, RDF/XML, N-Quads, TriG) |
| onto_clear | Reset the store before loading a different ontology |
| onto_pull | Fetch ontology from a remote URL or SPARQL endpoint |
| onto_push | Push ontology to a SPARQL endpoint |
| onto_import | Resolve and load owl:imports chains |
| onto_version | Save a named snapshot before making changes |
| onto_history | List saved version snapshots |
| onto_rollback | Restore a previous version |
| onto_ingest | Parse structured data into RDF and load into store |
| onto_map | Generate mapping config from data schema + ontology |
| onto_shacl | Validate data against SHACL shapes |
| onto_reason | Run RDFS or OWL-RL inference |
| onto_extend | Full pipeline: ingest, SHACL validate, reason |
| onto_plan | Show added/removed classes, blast radius, risk score |
| onto_apply | Apply changes in safe or migrate mode |
| onto_lock | Protect production IRIs from removal |
| onto_drift | Compare versions with rename detection |
| onto_enforce | Design pattern checks: generic, boro, value_partition, or custom |
| onto_monitor | Run SPARQL watchers with threshold alerts |
| onto_monitor_clear | Clear blocked state after resolving alerts |
| onto_crosswalk | Look up clinical terminology mappings (ICD-10, SNOMED, MeSH) |
| onto_enrich | Add skos:exactMatch triples linking to clinical codes |
| onto_validate_clinical | Check class labels against clinical terminology |
| onto_align | Detect alignment candidates between two ontologies |
| onto_align_feedback | Accept/reject alignment candidates for self-calibrating weights |
| onto_lineage | View session lineage trail (plan, enforce, apply, monitor, drift) |
| onto_lint_feedback | Accept/dismiss lint issues to teach suppression |
| onto_enforce_feedback | Accept/dismiss enforce violations to teach suppression |
| onto_unload | Unload from memory. With name targets a specific cached entry; delete_cache=true also removes the on-disk file |
| onto_recompile | Re-parse the source. With name rebuilds a non-active cached entry without disturbing the active in-memory store |
| onto_cache_status | Inspect compile cache: active slot, all entries, effective [cache] config |
| onto_cache_list | List cached ontologies with metadata (is_active, in_memory, mtime, size) |
| onto_cache_remove | Remove a cached ontology by name (pass delete_file=false to keep the on-disk N-Triples) |
| onto_repo_list | List RDF/OWL files in configured [general] ontology_dirs directories |
| onto_repo_load | Load an ontology from a configured repo by bare name, relative path, or absolute path |
| onto_status | Server health / loaded triple count |
| onto_marketplace | Browse / install standard ontologies from the curated catalogue |
| onto_dl_check | Check subClass ⊑ superClass via DL tableaux |
| onto_dl_explain | Explain why a class is unsatisfiable (DL clash trace) |
| onto_embed | Generate text + Poincaré structural embeddings for all classes |
| onto_search | Natural-language query → most-similar classes |
| onto_similarity | Cosine + Poincaré distance between two IRIs |
| onto_import_schema | Introspect PostgreSQL or DuckDB schema → generate OWL classes/properties/cardinality |
| onto_sql_ingest | Run SQL SELECT against PostgreSQL or DuckDB → RDF (DuckDB enables federation over CSV/Parquet/JSON/HTTPFS/postgres-scanner via its extensions) |
Build me a pizza ontology with classes for Pizza, PizzaBase (ThinAndCrispy, DeepPan),
PizzaTopping (Mozzarella, Tomato, Pepperoni, Mushroom), and properties hasBase, hasTopping.
Include rdfs:labels and rdfs:comments on everything. Validate and run competency queries
to check I can ask "what toppings does a Margherita have?"
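The competency question in that prompt maps onto a SPARQL query along these lines. This is a sketch: the prefix and class names are illustrative, and it assumes toppings are asserted on :Margherita as owl:someValuesFrom restrictions, which is how the classic Pizza tutorial models them.

```sparql
PREFIX owl:  <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX :     <http://example.org/pizza#>

# Toppings asserted on Margherita via someValuesFrom restrictions
SELECT ?topping WHERE {
  :Margherita rdfs:subClassOf ?r .
  ?r a owl:Restriction ;
     owl:onProperty :hasTopping ;
     owl:someValuesFrom ?topping .
}
```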
Load the ontology from https://www.w3.org/TR/owl-guide/wine.rdf, show me stats,
lint it, and run a SPARQL query to find all subclasses of Wine.
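The subclass query for the last step could use a SPARQL 1.1 property path. The Wine IRI below is the default namespace of the W3C wine ontology, but verify it against the loaded data (onto_stats or a quick SELECT on owl:Class) before relying on it.

```sparql
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

# All direct and transitive subclasses of Wine
SELECT ?sub WHERE {
  ?sub rdfs:subClassOf+ <http://www.w3.org/TR/2003/PR-owl-guide-20031209/wine#Wine> .
}
```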
I need to add a new class "GlutenFreePizza" as a subclass of Pizza with a restriction
that hasBase only GlutenFreeBase. Plan the change, enforce against generic rules,
and apply in safe mode.
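In Turtle, the requested class is a standard owl:allValuesFrom restriction. A sketch with illustrative prefixes (the :hasBase and :GlutenFreeBase IRIs must match the loaded ontology):

```turtle
@prefix :     <http://example.org/pizza#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

:GlutenFreePizza a owl:Class ;
    rdfs:label "Gluten-free pizza"@en ;
    rdfs:subClassOf :Pizza ,
        [ a owl:Restriction ;
          owl:onProperty :hasBase ;
          owl:allValuesFrom :GlutenFreeBase ] .
```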
I have a CSV of employees with columns: name, department, role, start_date.
Map it to the loaded HR ontology and ingest it. Then validate with SHACL shapes
and run inference to materialize department hierarchies.
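A SHACL shape for that employee data might look like the following. All class and property IRIs here are assumptions about the HR ontology; generate the real constraints from the ontology actually loaded.

```turtle
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix hr:  <http://example.org/hr#> .

hr:EmployeeShape a sh:NodeShape ;
    sh:targetClass hr:Employee ;
    sh:property [ sh:path hr:name ;       sh:minCount 1 ; sh:datatype xsd:string ] ;
    sh:property [ sh:path hr:department ; sh:minCount 1 ; sh:class hr:Department ] ;
    sh:property [ sh:path hr:startDate ;  sh:maxCount 1 ; sh:datatype xsd:date ] .
```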
Load schema.org and my company ontology. Run onto_align to find equivalentClass
and exactMatch candidates. I'll review and give feedback to calibrate the weights.
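An accepted candidate ultimately materializes as mapping triples like these (the company-ontology IRI is illustrative):

```turtle
@prefix owl:    <http://www.w3.org/2002/07/owl#> .
@prefix skos:   <http://www.w3.org/2004/02/skos/core#> .
@prefix co:     <http://example.org/company#> .
@prefix schema: <https://schema.org/> .

# Strong logical equivalence plus a looser SKOS mapping
co:Organisation owl:equivalentClass schema:Organization ;
                skos:exactMatch     schema:Organization .
```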
Dynamically decide the next tool call based on what the previous tool returned. If onto_validate fails, fix and retry. If onto_stats shows wrong counts, regenerate. If onto_lint finds missing labels, add them. The MCP tools are individual operations -- Claude is the orchestrator.
From source (Rust 1.85+):
git clone https://github.com/fabio-rovai/open-ontologies.git
cd open-ontologies && cargo build --release
./target/release/open-ontologies init
Add to ~/.claude/settings.json:
{
"mcpServers": {
"open-ontologies": {
"command": "/path/to/open-ontologies/target/release/open-ontologies",
"args": ["serve"]
}
}
}
Restart Claude Code. The onto_* tools are now available.
Add to ~/Library/Application Support/Claude/claude_desktop_config.json:
{
"mcpServers": {
"open-ontologies": {
"command": "/path/to/open-ontologies/target/release/open-ontologies",
"args": ["serve"]
}
}
}
Add to .cursor/mcp.json or equivalent:
{
"mcpServers": {
"open-ontologies": {
"command": "/path/to/open-ontologies/target/release/open-ontologies",
"args": ["serve"]
}
}
}
{
"mcpServers": {
"open-ontologies": {
"command": "docker",
"args": ["run", "-i", "--rm", "ghcr.io/fabio-rovai/open-ontologies", "serve"]
}
}
}
Build me a Pizza ontology following the Manchester University tutorial.
Include all 49 toppings, 24 named pizzas, spiciness value partition,
and defined classes (VegetarianPizza, MeatyPizza, SpicyPizza).
Validate it, load it, and show me the stats.
Claude generates Turtle, then runs the full pipeline automatically:
onto_validate → onto_load → onto_stats → onto_reason → onto_stats → onto_lint → onto_enforce → onto_query → onto_save → onto_version
Every build includes OWL reasoning (materializes inferred triples), design pattern enforcement, and automatic versioning.
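The spiciness value partition requested in the prompt is a standard OWL design pattern (and one of the onto_enforce rule packs). A sketch with illustrative IRIs:

```turtle
@prefix :     <http://example.org/pizza#> .
@prefix owl:  <http://www.w3.org/2002/07/owl#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .

# The partition is exactly the union of its members
:SpicinessValuePartition a owl:Class ;
    owl:equivalentClass [ a owl:Class ;
        owl:unionOf ( :Mild :Medium :Hot ) ] .

:Mild   a owl:Class ; rdfs:subClassOf :SpicinessValuePartition .
:Medium a owl:Class ; rdfs:subClassOf :SpicinessValuePartition .
:Hot    a owl:Class ; rdfs:subClassOf :SpicinessValuePartition .

# Members are pairwise disjoint
[] a owl:AllDisjointClasses ;
   owl:members ( :Mild :Medium :Hot ) .

# Each pizza has at most one spiciness value
:hasSpiciness a owl:ObjectProperty , owl:FunctionalProperty ;
    rdfs:range :SpicinessValuePartition .
```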
The Studio is a native desktop application that wraps the same engine in a visual environment — no browser, no server to manage. It runs entirely on your machine: the engine sidecar handles RDF/OWL operations while the UI renders the graph in real time.
Think of it as Protégé meets an AI copilot. Type "build ontology about cats" and watch a 1,400-class ontology appear in the tree — classes, properties, individuals, and axioms built automatically across 13 pipeline steps. Click any node to inspect its triples, trace connections via clickable pills, and follow every change through the lineage panel.
Prior to v0.1.12, the Studio used a D3.js horizontal tree and a 3D force-directed graph (Three.js / WebGL). Both worked for small ontologies (~100 classes) but became unusable at IES-level depth: the D3 tree couldn't handle 500+ nodes without layout thrashing, and the 3D graph froze the WebKit webview above 1,000 nodes.
The v2 deep builder changed the equation — a single /build command now produces 1,400+ classes. We replaced both views with a virtualized DOM tree: only visible rows exist in the DOM (constant memory regardless of ontology size), with hierarchy connector lines, type-filtered legend, search, breadcrumb navigation, and a connections panel. This handles the full IES Common (511 classes) and deep-built ontologies (1,400+ classes) without lag.
The Studio launches three processes that communicate locally:
One of these listens on localhost:8080.

When you type in the chat panel, your message goes to the Agent sidecar, which sends it to Claude. Claude decides which onto_* tools to call, the engine executes them, and the UI refreshes the graph. The entire loop -- prompt to visual update -- takes seconds.
Prerequisites: Rust + Cargo · Node.js 18+
# 1. Build the engine binary (from repo root)
cargo build --release
# 2. Install JS dependencies
cd studio && npm install
# 3. Run
PATH=/opt/homebrew/bin:~/.cargo/bin:$PATH npm run tauri dev
The first launch compiles the Tauri shell (~2 min). Subsequent launches start in seconds.
| Feature | Description |
| --- | --- |
| Virtualized Tree | Ontology explorer that handles 1,500+ classes without lag. Hierarchy connector lines, collapsible branches, type-filtered legend (Class/Property/Individual), search with auto-expand, breadcrumb path navigation, and a connections panel showing domain/range relationships as clickable pills. Only visible rows are in the DOM — constant memory regardless of ontology size. |
| AI Agent Chat | Natural language ontology engineering via Claude Opus 4.6 + Agent SDK. Two build modes: /build runs a 13-step pipeline producing IES-level ontologies (500-1,500+ classes, 100-200+ properties), /sketch runs 3 steps for quick prototyping (~80 classes). Each tool call is shown in real time. |
| Property Inspector | Protégé-style inline triple editor. Click any node to see its rdfs:subClassOf, rdfs:label, rdfs:domain, rdfs:range and all other triples. Edit in place, hover to delete, + Add for new triples. Changes are immediately reflected in the graph. |
| Lineage Panel | Full audit trail from SQLite: every plan, apply, enforce, drift, monitor, and align event, grouped by session with timestamps. See exactly what Claude did and in what order. |
| Named Save | ⌘S to save as ~/.open-ontologies/<name>.ttl. Auto-saves to studio-live.ttl after every mutation so you never lose work. |
| Shortcut | Action |
| --- | --- |
| ⌘J | Toggle AI chat panel |
| ⌘I | Toggle property inspector |
| ⌘S | Save ontology |
| F | Fit graph to viewport (tree view) |
| R | Reset zoom (tree view) |
| Esc | Deselect node |
| Shift+click | Collapse/expand branch (tree view) |
| Scroll | Zoom in/out |
| Click + drag | Pan |
OntoAxiom tests axiom identification across 9 ontologies and 3,042 ground truth axioms.
| Approach | F1 | vs o1 (paper best) |
| --- | --- | --- |
| o1 (paper's best) | 0.197 | — |
| Bare Claude Opus | 0.431 | +119% |
| MCP extraction | 0.717 | +264% |