claude-code-local vs llama.cpp
Side-by-side AI skills comparison
                claude-code-local     llama.cpp
                (nicedreamzapp)       (ggml-org)
Stars           2,052                 103,839
Forks           395                   16,880
Language        Python                C++
Category        AI Agents             AI Agents
Security        Pending               Verified
Votes           0                     0
Bookmarks       0                     0
Topics
claude-code-local: abliterated, ai-privacy, airgap, ambient-computing, anthropic, apple-silicon, browser-agent, claude-code, gemma, llama, local-ai, local-llm, macos, mlx, mlx-lm, offline-ai, on-device-ai, private-ai, qwen, voice-ai
llama.cpp: ggml
Description
claude-code-local: Run Claude Code 100% on-device with local AI on Apple Silicon. MLX-native Anthropic-API server, 65 tok/s Qwen 3.5 122B, Llama 3.3 70B, Gemma 4 31B. Private, offline, airgap-ready. Built for NDA / legal / healthcare workflows.
llama.cpp: LLM inference in C/C++
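The claude-code-local description amounts to swapping Anthropic's hosted API for a local Anthropic-compatible server. A minimal sketch of how such a setup is typically wired, assuming the local server listens on localhost:8080 (the port and the dummy key are placeholders, not values documented by either repo; consult claude-code-local's README for the address its server actually exposes):

```shell
# Hypothetical local setup: route Claude Code to a locally hosted
# Anthropic-compatible API instead of api.anthropic.com.
export ANTHROPIC_BASE_URL="http://localhost:8080"   # placeholder local endpoint
export ANTHROPIC_API_KEY="local-dummy-key"          # many local servers accept any non-empty key

# With the environment set, the standard Claude Code CLI talks to the
# local server, so prompts and code never leave the machine.
claude
```

The same pattern applies to any local inference backend that speaks a compatible HTTP API; the model being served (and its speed) is then determined by the local server, not by the CLI.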