by stakpak
Ship your code, on autopilot. An open source agent that lives on your machines 24/7 and keeps your apps running. 🦀
# Add to your Claude Code skills
git clone https://github.com/stakpak/agent

⭐ Help us reach more developers and grow the Stakpak community. Star this repo!

curl -sSL https://stakpak.dev/install.sh | sh # install Stakpak
stakpak init # understand your apps and tech stack
stakpak autopilot up # start the autonomous agent, running 24/7 in the background
For more installation options, see the sections below.
You can't trust most AI agents with your DevOps. One mistake, and your production is toast. Stakpak is built different:
Generate infrastructure code, debug Kubernetes, configure CI/CD, and automate deployments, all without giving an LLM the keys to production.
Use the new lifecycle aliases for one-command setup/start/stop:
stakpak up # alias for: stakpak autopilot up
stakpak down # alias for: stakpak autopilot down
You can also use the canonical subcommands:
stakpak autopilot up
stakpak autopilot status
stakpak autopilot logs
stakpak autopilot down
stakpak autopilot doctor
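The lifecycle subcommands above compose naturally into scripts. A minimal supervisor-style sketch, assuming `stakpak autopilot status` exits non-zero when the agent is not running (the exit-code behavior is an assumption, not documented here):

```shell
#!/bin/sh
# Hedged sketch: restart the autopilot if its status check fails.
# STAKPAK is overridable so the logic can be tested without the CLI installed.
STAKPAK="${STAKPAK:-stakpak}"

ensure_autopilot() {
  if "$STAKPAK" autopilot status >/dev/null 2>&1; then
    echo "autopilot running"
  else
    echo "autopilot down, restarting"
    "$STAKPAK" autopilot up || echo "restart failed" >&2
  fi
}
```

Run `ensure_autopilot` from your own cron or systemd timer if you want an external watchdog on top of the built-in service.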
Before running autopilot on a remote VM:
stakpak up now runs preflight checks before startup, and stakpak autopilot doctor can be used as a deployment-readiness check before first boot:
stakpak autopilot doctor
stakpak up
See also: cli/README.md
~/.stakpak/config.toml: profile behavior (model, allowed_tools, auto_approve, system_prompt, max_turns, provider credentials)
~/.stakpak/autopilot.toml: runtime wiring (schedules, channels, service/server settings)

Use profile = "name" on schedules/channels and keep behavior inside profile definitions.
# schedule profile
stakpak autopilot schedule add health --cron '*/5 * * * *' --prompt 'Check health' --profile monitoring
# channel profile
stakpak autopilot channel add slack --bot-token "$SLACK_BOT_TOKEN" --app-token "$SLACK_APP_TOKEN" --profile ops
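The split between the two files can be sketched directly in TOML. The config.toml keys below follow the profile fields listed above; the table and key names shown for autopilot.toml are illustrative assumptions, not the documented schema:

```toml
# ~/.stakpak/config.toml - behavior lives in the profile
[profiles.monitoring]
model = "anthropic/claude-sonnet-4-5"
max_turns = 20

# ~/.stakpak/autopilot.toml - wiring references the profile by name
# NOTE: the [[schedules]] table and key names here are assumptions for illustration
[[schedules]]
name = "health"
cron = "*/5 * * * *"
prompt = "Check health"
profile = "monitoring"
```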
Full setup guide: cli/README.md
brew tap stakpak/stakpak
brew install stakpak
To update it, run:
brew update
brew upgrade stakpak
Download the latest binary for your platform from our GitHub Releases.
This image includes the most popular CLI tools the agent might need for everyday DevOps tasks, such as docker, kubectl, the AWS CLI, gcloud, the Azure CLI, and more.
docker pull ghcr.io/stakpak/agent:latest
You can use your own Anthropic or OpenAI API keys, custom OpenAI compatible endpoint, or a Stakpak API key.
Just run stakpak and follow the instructions; this will create a new API key for you.
stakpak
Brave users may encounter issues with automatic redirects to localhost ports during the API key creation flow. If this happens to you:
Copy your new key from the browser and paste it into your terminal:
export STAKPAK_API_KEY=<mykey>
stakpak auth login --api-key $STAKPAK_API_KEY
stakpak account
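When scripting the key-based login, it can help to fail fast on an empty variable before calling the CLI. A small sketch; the guard function itself is illustrative and not part of Stakpak:

```shell
# Illustrative guard: refuse to run `stakpak auth login` with an empty key.
require_key() {
  if [ -z "${STAKPAK_API_KEY:-}" ]; then
    echo "STAKPAK_API_KEY is not set" >&2
    return 1
  fi
  echo "key present"
}

# Usage:
# require_key && stakpak auth login --api-key "$STAKPAK_API_KEY"
```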
# Anthropic
stakpak auth login --provider anthropic --api-key $ANTHROPIC_API_KEY
# OpenAI
stakpak auth login --provider openai --api-key $OPENAI_API_KEY
# Gemini
stakpak auth login --provider gemini --api-key $GEMINI_API_KEY
Create ~/.stakpak/config.toml with one of these configurations:
Option 1: Bring Your Own Keys (BYOK) - Use your Anthropic/OpenAI API keys:
[profiles.byok]
provider = "local"
# Unified model preference field
model = "anthropic/claude-sonnet-4-5"
# Built-in providers - credentials can also be set via environment variables
# (ANTHROPIC_API_KEY, OPENAI_API_KEY, GEMINI_API_KEY)
[profiles.byok.providers.anthropic]
type = "anthropic"
api_key = "sk-ant-..."
[profiles.byok.providers.openai]
type = "openai"
api_key = "sk-..."
[profiles.byok.providers.gemini]
type = "gemini"
api_key = "..."
[settings]
Option 2: Bring Your Own LLM - Use a local OpenAI-compatible endpoint (e.g. Ollama, LM Studio):
[profiles.offline]
provider = "local"
# Custom provider models use the format: provider_key/model_name
model = "offline/qwen/qwen3-coder-30b"
# The provider key "offline" becomes the model prefix
[profiles.offline.providers.offline]
type = "custom"
api_endpoint = "http://localhost:11434/v1"
# api_key is optional for local providers
[settings]
Option 3: Mix Built-in and Custom Providers:
[profiles.hybrid]
provider = "local"
# Unified model field (provider-prefixed)
model = "anthropic/claude-sonnet-4-5"
[profiles.hybrid.providers.anthropic]
type = "anthropic"
# Uses ANTHROPIC_API_KEY env var
[profiles.hybrid.providers.offline]
type = "custom"
api_endpoint = "http://localhost:11434/v1"
[settings]
Then run with your profile:
stakpak --profile byok
# or
stakpak --profile offline
# or
stakpak --profile hybrid
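Profile selection can also be automated. A sketch that probes connectivity and falls back to the local endpoint profile; the profile names come from the examples above, while the fallback logic and probe URL are illustrative assumptions:

```shell
# Illustrative helper: choose "byok" when the probe succeeds, "offline" otherwise.
# The probe command is injectable ($1) so it can be swapped or stubbed in tests;
# it is left unquoted deliberately to allow a multi-word command.
pick_profile() {
  probe="${1:-curl -fsS --max-time 2 https://api.anthropic.com/}"
  if $probe >/dev/null 2>&1; then
    echo "byok"
  else
    echo "offline"
  fi
}

# Usage:
# stakpak --profile "$(pick_profile)"
```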
# Open the TUI
stakpak
# Resume execution from a checkpoint
stakpak -c <checkpoint-id>
docker run -it --entrypoint stakpak ghcr.io/stakpak/agent:latest
# for containerization tasks (you need to mount the Docker socket)
docker run -it \
-v "/var/run/docker.sock":"/var/run/docker.sock" \
-v "{your app path}":"/agent/" \
--entrypoint stakpak ghcr.io/stakpak/agent:latest
You can use Stakpak as a secure MCP proxy or expose its security-hardened tools through an MCP server.
--tool-mode local: file operations and command execution only (no API key required)