by ybuild-ai
Agent skill for turning AI images and videos into playable game art assets
```sh
# Add to your Claude Code skills
git clone https://github.com/ybuild-ai/ai-game-art-pipeline-skill
```
Use this skill when a user wants to generate, plan, debug, or package game art that must run in an actual game runtime.
The skill is provider-neutral. It does not assume any specific API key or vendor SDK. If a model call is needed, use the user's available provider through an adapter or ask them to implement scripts/provider_stub.py.
Pick the pipeline by runtime job, not by model hype.
| Runtime job | Pipeline |
|---|---|
| Static props, items, FX icons | Image model + style refs + removable background |
| Existing/brand character | Reuse canonical sprite sheets before regenerating |
| Combat character animation | Keyframe-first + 2D grid + post-processing + curation |
| Walk/run/body mechanics | Prefer video motion reference; use 3D skeleton only when cleanup is acceptable |
| Ambient loops | Video model -> extracted frames |
| Cinematic ultimate / boss intro | Full-screen video-to-frames + code-driven hit logic |
| Runtime feel | Deterministic code: hit pause, shake, particles, trails, SFX |
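The last row deserves emphasis: runtime feel should come from deterministic code, not from generated media. A minimal sketch of what that can mean in practice (the function names here are illustrative, not part of this repo): seeded screen shake plus a fixed-length hit pause.

```python
import math
import random

def shake_offset(elapsed: float, duration: float, magnitude: float, seed: int = 0) -> tuple[float, float]:
    """Deterministic camera-shake offset: same seed and tick give the same result."""
    if elapsed >= duration:
        return (0.0, 0.0)
    decay = 1.0 - elapsed / duration              # linear falloff over the shake
    tick = round(elapsed * 60)                    # quantize time to a 60 Hz tick
    rng = random.Random(seed * 1_000_003 + tick)  # reproducible per (seed, tick)
    angle = rng.uniform(0.0, 2.0 * math.pi)
    r = magnitude * decay
    return (r * math.cos(angle), r * math.sin(angle))

def hit_pause_seconds(frames: int = 4, fps: int = 60) -> float:
    """How long to freeze the simulation after a heavy hit."""
    return frames / fps
```

Because the offset is a pure function of seed and tick, replays and networked clients stay in sync, which generated media cannot guarantee.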
Read only the relevant file:
- references/static-assets.md — props, icons, items, simple monsters, canonical asset reuse.
- references/battle-sprites.md — keyframe-first combat sprite sheets, 2D grids, anchors, contact sheets.
- references/motion-video.md — 3D skeleton pitfalls, video motion references, Veo/Seedance-style video-to-frames, cinematic ultimates.

Provider-neutral workflows for turning AI images and videos into playable game assets.
This skill helps agents plan, generate, clean, package, and QA game-runtime art assets: sprites, icons, backgrounds, video-to-frames animation, cinematic ultimates, anchors, compression, and runtime checks.
It is not a prompt pack or a model leaderboard. The point is to ship assets that survive a real game loop.
Most AI game art demos stop at a beautiful image; games impose stricter requirements at runtime. Hence the core rule: pick the pipeline by runtime job, not by model hype.
If you use the Agent Skills installer (recommended):

```sh
npx skills add ybuild-ai/ai-game-art-pipeline-skill --skill ai-game-art-pipeline -g
```
- references/backgrounds.md — level backgrounds, master-first parallax, outpaint chains, mobile texture limits.
- references/runtime-shipping.md — formats, compression, CDN layout, audio/SFX, runtime QA.

Scripts are local utilities only. They do not call model APIs.
- scripts/provider_stub.py — adapter interface the user can implement for their provider.
- scripts/chroma_key_magenta.py — remove magenta backgrounds from generated sheets/icons.
- scripts/sheet_contact.py — make numbered contact sheets for curation.
- scripts/extract_video_frames.py — extract/resize video frames for runtime texture sequences.

Do not write a full vendor SDK unless the user explicitly asks for one. Prefer a thin adapter around scripts/provider_stub.py so credentials, endpoints, retries, billing, and model choices stay in the user's project.
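As a rough sketch of the adapter shape (the names here are hypothetical; scripts/provider_stub.py defines the actual interface), the idea is a single class that hides everything vendor-specific and hands back plain bytes:

```python
from typing import Protocol, Sequence

class ImageProvider(Protocol):
    """Anything vendor-specific (keys, endpoints, retries) lives behind this."""
    def generate_image(self, prompt: str, *, style_refs: Sequence[str] = ()) -> bytes: ...

class FakeImageProvider:
    """Offline stand-in so the rest of the pipeline can be exercised without an API."""
    def generate_image(self, prompt: str, *, style_refs: Sequence[str] = ()) -> bytes:
        # Echo the request as bytes; a real adapter would return PNG/WebP data.
        return f"fake-image:{prompt}:{len(style_refs)} refs".encode()
```

Pipeline code then types only against the protocol, so swapping vendors is a one-file change in the user's project.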
If the user wants examples, point them to examples/providers/README.md and adapt only the minimal file they need.

When using this skill, produce assets that are planned, generated, cleaned, packaged, and QA-checked for the target runtime.
Manual Codex install:
```sh
git clone https://github.com/ybuild-ai/ai-game-art-pipeline-skill.git ~/.codex/skills/ai-game-art-pipeline
```
Manual Claude Code install:
```sh
git clone https://github.com/ybuild-ai/ai-game-art-pipeline-skill.git ~/.claude/skills/ai-game-art-pipeline
```
Then ask your agent for tasks like:
| Runtime job | Recommended route |
|---|---|
| Static props, items, FX icons | Image model + style refs + removable background |
| Existing or brand character | Reuse canonical sprite sheets before regenerating |
| Combat character animation | Keyframe-first + 2D grid + post-processing + curation |
| Walk/run/body mechanics | Prefer video motion reference; use 3D skeleton only when cleanup is acceptable |
| Ambient loops | Video model -> extracted frames |
| Cinematic ultimate / boss intro | Full-screen video-to-frames + code-driven hit logic |
| Runtime feel | Deterministic code: hit pause, shake, particles, trails, SFX |
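For the ambient-loop and video-to-frames routes, the runtime half is deliberately boring: pick which extracted frame to draw from elapsed time. A sketch, assuming frames were extracted at a fixed rate:

```python
def loop_frame(elapsed: float, frame_count: int, fps: float = 14.0) -> int:
    """Index of the extracted frame to draw at a given elapsed time, looping forever."""
    if frame_count <= 0:
        raise ValueError("frame_count must be positive")
    return int(elapsed * fps) % frame_count
```

Keeping frame selection a pure function of time makes loops seamless and trivially testable.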
| Combat sprites | Cinematic ultimate |
|---|---|
| ![Combat sprite grid](media/battle-sprite-grid.jpg) | ![Cinematic ultimate frames](media/seedance-awakening-frames.jpg) |

| Motion reference tradeoff | Pipeline map |
|---|---|
| ![Skeleton vs video motion](media/skeleton-vs-video-motion.jpg) | ![Pipeline map](media/pipeline-map.jpg) |
```
SKILL.md
references/
  static-assets.md
  battle-sprites.md
  motion-video.md
  backgrounds.md
  runtime-shipping.md
scripts/
  provider_stub.py
  chroma_key_magenta.py
  sheet_contact.py
  extract_video_frames.py
examples/
  prompts.md
  batch_generation_example.py
  providers/
    README.md
    minimal_provider_example.py
    http_image_provider_example.py
    http_video_provider_example.py
media/
  pipeline-map.jpg
  battle-sprite-grid.jpg
  seedance-awakening-frames.jpg
  skeleton-vs-video-motion.jpg
```
Scripts are local utilities only. They do not contain API keys, endpoints, private paths, or vendor SDK code.
| Script | Purpose |
|---|---|
| scripts/provider_stub.py | Adapter interface users can implement for their own image/video provider |
| scripts/chroma_key_magenta.py | Remove solid magenta backgrounds from generated assets |
| scripts/sheet_contact.py | Build numbered contact sheets for sprite curation |
| scripts/extract_video_frames.py | Extract and resize video frames for runtime texture sequences |
Examples:
```sh
python scripts/chroma_key_magenta.py input.png output.png
python scripts/sheet_contact.py frames/ contact.png --cols 6
python scripts/extract_video_frames.py ultimate.mp4 frames/ --fps 14 --start 0.6 --duration 3.6 --width 1280
```
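The chroma-key step is simple enough to sketch in pure Python. The real script works on image files; this illustrative version operates on raw RGBA tuples, and the tolerance default is an assumption:

```python
def chroma_key_magenta(pixels, tolerance=60):
    """Make pixels near pure magenta (255, 0, 255) fully transparent.

    pixels: iterable of (r, g, b, a) tuples; returns a new list.
    """
    out = []
    for r, g, b, a in pixels:
        # Near-magenta means red and blue near 255 and green near 0.
        if (255 - r) <= tolerance and g <= tolerance and (255 - b) <= tolerance:
            out.append((r, g, b, 0))  # keyed out: alpha -> 0
        else:
            out.append((r, g, b, a))
    return out
```

A per-channel tolerance keeps slightly off-magenta fringe pixels from surviving into the runtime texture.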
The skill intentionally does not ship a real image/video SDK. Model APIs change quickly, and users should own credentials, endpoints, model names, retry policy, and billing behavior.
Instead, the repo includes thin adapter examples:
| Example | Use it when |
|---|---|
| examples/providers/minimal_provider_example.py | You want a fake provider to test the pipeline shape locally |
| examples/providers/http_image_provider_example.py | You have an image-generation HTTP proxy that returns a URL or base64 image |
| examples/providers/http_video_provider_example.py | You have an async video-generation HTTP proxy with job polling |
| examples/batch_generation_example.py | You want to turn a small art plan into provider requests |
Read examples/providers/README.md before wiring a real provider.
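The async video route usually reduces to submit-then-poll. A sketch of the polling loop such an adapter might contain, under assumed semantics (the callables and status strings are hypothetical, not this repo's API):

```python
import time

def wait_for_video(submit, poll, *, interval=2.0, timeout=300.0, sleep=time.sleep):
    """Submit a video job, then poll until it finishes or the timeout expires.

    submit() -> job_id; poll(job_id) -> (status, result_or_None),
    where status is one of "queued", "running", "done", "failed".
    """
    job_id = submit()
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status, result = poll(job_id)
        if status == "done":
            return result
        if status == "failed":
            raise RuntimeError(f"video job {job_id} failed")
        sleep(interval)  # injectable so tests need no real waiting
    raise TimeoutError(f"video job {job_id} timed out after {timeout}s")
```

Injecting `sleep` keeps the loop deterministic in tests and lets the caller own backoff policy.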
Good fit: art that must run in an actual game runtime, such as sprites, icons, backgrounds, video-to-frames animation, and cinematic ultimates.

Bad fit: prompt collections, model leaderboards, and one-off concept images that never have to survive a game loop.
Contributions are welcome; see CONTRIBUTING.md for guidelines on the most useful additions.
MIT. See LICENSE.