SkillClaw: Let Skills Evolve Collectively with Agentic Evolver
by AMAP-ML
# Add to your Claude Code skills
git clone https://github.com/AMAP-ML/SkillClaw
N users, one Skill, continuous evolution. Every conversation compounds. Every user contributes.
SkillClaw makes LLM agents progressively better by evolving reusable skills from real session data and sharing them across a group of agents.
The system has two components:
Client Proxy — A local API proxy (/v1/chat/completions, /v1/messages) that intercepts agent requests, records session artifacts, and syncs skills with shared storage.
Evolve Server (evolve_server) — A single evolve service that reads session data from shared storage, evolves or creates skills, and writes them back. It supports two engines:
- workflow: fixed 3-stage LLM pipeline (Summarize → Aggregate → Execute)
- agent: OpenClaw-driven agent workspace with direct skill editing

Both components share the same storage layer (Alibaba OSS / S3 / local filesystem) and skill format (SKILL.md).
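Skills live in SKILL.md files. Claude Code skills conventionally start with YAML frontmatter carrying a name and description, followed by free-form instructions; a minimal sketch, assuming SkillClaw follows that same convention (the skill name and contents below are purely illustrative):

```markdown
---
name: git-rebase-helper
description: Walks through an interactive rebase safely, with checks before force-pushing.
---

# git-rebase-helper

Use this skill when the user asks to rewrite local commit history.

## Steps
1. Confirm the branch is not shared, or warn before rewriting.
2. Run `git rebase -i <base>` and resolve conflicts one commit at a time.
3. Prefer `git push --force-with-lease` over `--force`.
```

The evolve server edits files of this shape in shared storage; the client proxy syncs them down into your local skills directory.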
Think of SkillClaw as "per-user client, per-group evolver":
- Each user runs the skillclaw client proxy on their own machine.
- The group runs a single skillclaw-evolve-server.
- Both sides point at the same shared storage backend (local, oss, or s3).

This separation is the key beginner mental model.
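The per-user / per-group split can be sketched as:

```text
user A ── client proxy ──┐
user B ── client proxy ──┼──► shared storage (local / oss / s3) ◄── evolve server
user C ── client proxy ──┘    session artifacts in, evolved SKILL.md out
```

Clients only ever talk to storage; the evolve server is the one component that rewrites skills.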
If this is your first time, start with Path A. It proves the client-side install and usage first, without mixing in shared deployment concerns.
- openclaw: only if you intentionally choose the openclaw CLI integration or the server agent engine

The beginner path below is locally smoke-tested on macOS.
Everything starts with a plain git clone.

macOS / Linux:
git clone https://github.com/AMAP-ML/SkillClaw.git && cd SkillClaw
bash scripts/install_skillclaw.sh
source .venv/bin/activate
Windows PowerShell (manual install because the repository does not currently ship a native .ps1 installer):
git clone https://github.com/AMAP-ML/SkillClaw.git
Set-Location SkillClaw
python -m venv .venv
.\.venv\Scripts\Activate.ps1
python -m pip install -U pip
python -m pip install -e ".[evolve,sharing,server]"
skillclaw setup
The setup wizard prompts for the provider, model, local skills directory, PRM settings, optional CLI agent integration, and optional shared storage.
For a minimal first run:
- none for the CLI agent if you do not want SkillClaw to auto-configure an external agent yet
- ~/.skillclaw/skills as the local skills directory
- ~/.skillclaw/local-share as the local sharing path

Then start the proxy and confirm it is running:

skillclaw start --daemon
skillclaw status
PROXY_PORT="$(skillclaw config proxy.port | awk '{print $2}')"
curl "http://127.0.0.1:${PROXY_PORT}/healthz"
The default proxy port is 30000, but the health check should follow your configured proxy.port. Use skillclaw config show to inspect the active upstream model, proxy port, and sharing target.
At this point SkillClaw is already usable as a single-user local proxy. You do not need to run an evolve server just to use the client.
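You can sanity-check that with nothing but curl, the same way any OpenAI-compatible client would talk to the proxy. A minimal sketch, assuming the defaults from this guide (proxy.port 30000 and the skillclaw-model alias; adjust both to your own config):

```shell
# Default port from this guide; override PROXY_PORT if you configured another.
PROXY_PORT="${PROXY_PORT:-30000}"

# Build an OpenAI-style chat completion request body.
BODY=$(cat <<'JSON'
{
  "model": "skillclaw-model",
  "messages": [{"role": "user", "content": "Say OK."}]
}
JSON
)

# Validate the payload locally before sending it.
echo "$BODY" | python3 -m json.tool > /dev/null && echo "payload ok"

# Send it through the local SkillClaw proxy.
curl -s "http://127.0.0.1:${PROXY_PORT}/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d "$BODY" || echo "proxy not reachable (is skillclaw start --daemon running?)"
```

If the proxy is up, the response is a normal chat completion from your configured upstream model, with the session recorded as an artifact for later evolution.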
If you later want automatic skill evolution for yourself, keep the same client install and continue with the Server Guide.
If you already use Hermes, the client-side path is:
- Run skillclaw setup and choose hermes for the CLI agent to configure.
- Keep the proxy model name exposed to agents as skillclaw-model unless you have a specific reason to change it.
- Let SkillClaw update ~/.hermes/config.yaml to point Hermes at the local proxy.

Minimal verification:
skillclaw start --daemon
hermes chat -Q -m skillclaw-model -q "Reply with exactly HERMES_SKILLCLAW_OK and nothing else."
Install the same client as in Path A, then point your local client at the group's shared storage. The easiest beginner route is to rerun skillclaw setup, enable shared storage, and fill in the values your server operator gives you.
You can also set the keys directly. Example for OSS:
skillclaw config sharing.enabled true
skillclaw config sharing.backend oss
skillclaw config sharing.endpoint https://oss-cn-hangzhou.aliyuncs.com
skillclaw config sharing.bucket my-skillclaw-bucket