by bgdnvk
Autonomous systems engineering CLI agent for any cloud environment: AWS, GCP, Cloudflare, etc.
# Add to your Claude Code skills
git clone https://github.com/bgdnvk/clanker

Alpha version.
Main agent powering Clanker Cloud
Beta coming soon at clankercloud.ai
Docs available at docs.clankercloud.ai
Interactive docs: Clanker Cloud: How It Works
Courtesy of @cto_junior
Interactive getting started: Clanker Getting Started
Courtesy of @cto_junior
Ask questions about your infra (and optionally GitHub/etc). Clanker can inspect existing environments and also generate or apply infrastructure and deploy plans through its maker and deploy flows.
Repo: bgdnvk/clanker
Homebrew tap: clankercloud/homebrew-tap
brew tap clankercloud/tap
brew install clanker
make install
Requires the AWS CLI (Clanker runs it with --no-cli-pager):
brew install awscli
Copy the example config and edit it for your environments/providers:
cp .clanker.example.yaml ~/.clanker.yaml
Alternatively, you can run:
clanker config init
Most providers use environment variables for API keys, e.g.:
export OPENAI_API_KEY="..."
export GEMINI_API_KEY="..."
export COHERE_API_KEY="..."
If you run without ~/.clanker.yaml:
- The default AI provider is openai (unless you pass --ai-profile).
- openai: --openai-key → OPENAI_API_KEY (also supports ai.providers.openai.api_key and ai.providers.openai.api_key_env if config exists).
- gemini (--ai-profile gemini-api): --gemini-key → GEMINI_API_KEY (also supports ai.providers.gemini-api.api_key and ai.providers.gemini-api.api_key_env if config exists).
- cohere (--ai-profile cohere): --cohere-key → COHERE_API_KEY (also supports ai.providers.cohere.api_key and ai.providers.cohere.api_key_env if config exists).
- Default models: openai defaults to gpt-5; gemini/gemini-api defaults to gemini-3-pro-preview; cohere defaults to command-a-03-2025.

Clanker uses your local AWS CLI profiles (not raw access keys in the clanker config).
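If you do keep a ~/.clanker.yaml, the provider keys can live there instead of flags. A minimal sketch (field names taken from the key-resolution notes above; the authoritative schema is .clanker.example.yaml):

```yaml
ai:
  default_provider: openai
  providers:
    openai:
      # read the key from an environment variable instead of storing it inline
      api_key_env: OPENAI_API_KEY
    gemini-api:
      api_key_env: GEMINI_API_KEY
    cohere:
      api_key_env: COHERE_API_KEY
```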
Create a profile:
aws configure --profile clankercloud-tekbog | cat
aws sts get-caller-identity --profile clankercloud-tekbog | cat
Set the default environment + profile in ~/.clanker.yaml:
infra:
  default_provider: aws
  default_environment: clankercloud
  aws:
    environments:
      clankercloud:
        profile: clankercloud-tekbog
        region: us-east-1
Override for a single command:
clanker ask --aws --profile clankercloud-tekbog "what lambdas do we have?" | cat
Clanker also exposes its own MCP surface as a CLI command.
Run it over HTTP:
clanker mcp --transport http --listen 127.0.0.1:39393 | cat
Or over stdio for MCP clients that launch commands directly:
clanker mcp --transport stdio | cat
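For clients that launch MCP servers as subprocesses, registration is typically a small JSON entry. This is a sketch in the common mcpServers style (the exact config file location and schema depend on your MCP client, not on Clanker):

```json
{
  "mcpServers": {
    "clanker": {
      "command": "clanker",
      "args": ["mcp", "--transport", "stdio"]
    }
  }
}
```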
The CLI MCP currently exposes tools to:
- run clanker commands through MCP, including ask, openclaw, and other subcommands

Clanker chat routing also recognizes Clanker Cloud app questions now. If you use clanker talk and ask about the running desktop app or its saved settings, it will try the local Clanker Cloud backend first and fall back to Hermes if the app is not running.
Examples:
clanker ask --route-only "use clanker cloud mcp to show my saved settings" | cat
clanker ask --route-only "ask clanker cloud about the running app backend" | cat
clanker mcp --transport http --listen 127.0.0.1:39393 | cat
Example MCP calls against the standalone Clanker CLI server:
# Start the HTTP MCP server
clanker mcp --transport http --listen 127.0.0.1:39393 | cat
# Initialize a client session
curl -sS -X POST http://127.0.0.1:39393/mcp \
-H 'Content-Type: application/json' \
-H 'Accept: application/json, text/event-stream' \
--data '{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"local-cli","version":"1.0"}}}' | jq
# List available CLI MCP tools
curl -sS -X POST http://127.0.0.1:39393/mcp \
-H 'Content-Type: application/json' \
-H 'Accept: application/json, text/event-stream' \
--data '{"jsonrpc":"2.0","id":2,"method":"tools/list","params":{}}' | jq
# Return the installed clanker version
curl -sS -X POST http://127.0.0.1:39393/mcp \
-H 'Content-Type: application/json' \
-H 'Accept: application/json, text/event-stream' \
--data '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"clanker_version","arguments":{}}}' | jq
# Return the internal route decision for a prompt
curl -sS -X POST http://127.0.0.1:39393/mcp \
-H 'Content-Type: application/json' \
-H 'Accept: application/json, text/event-stream' \
--data '{"jsonrpc":"2.0","id":4,"method":"tools/call","params":{"name":"clanker_route_question","arguments":{"question":"use clanker cloud mcp to show my saved settings"}}}' | jq
# Run a real clanker command through MCP
curl -sS -X POST http://127.0.0.1:39393/mcp \
-H 'Content-Type: application/json' \
-H 'Accept: application/json, text/event-stream' \
--data '{"jsonrpc":"2.0","id":5,"method":"tools/call","params":{"name":"clanker_run_command","arguments":{"args":["ask","--route-only","use clanker cloud mcp to show my saved settings"]}}}' | jq
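The curl bodies above all share one JSON-RPC shape, so scripting them is mostly string assembly. A small sketch (mcp_call_body is a hypothetical helper, not part of Clanker; pipe its output to curl as in the examples above):

```shell
# Build the JSON-RPC 2.0 body for an MCP tools/call request.
mcp_call_body() {
  local id="$1" tool="$2" args="$3"   # args must already be a JSON object
  printf '{"jsonrpc":"2.0","id":%s,"method":"tools/call","params":{"name":"%s","arguments":%s}}' \
    "$id" "$tool" "$args"
}

# Example: the same clanker_version call as above.
mcp_call_body 3 clanker_version '{}'
```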
The standalone CLI MCP currently exposes these tools:
- clanker_version
- clanker_route_question
- clanker_run_command

Flags:
- --aws: force AWS context/tooling for the question (uses the default env/profile from ~/.clanker.yaml unless you pass --profile)
- --profile <name>: override the AWS CLI profile for this run
- --ai-profile <name>: select an AI provider profile from ai.providers.<name> (overrides ai.default_provider)
- --maker: generate an AWS CLI plan (JSON) for infrastructure changes
- --destroyer: allow destructive AWS CLI operations when using --maker
- --apply: apply an approved maker plan (reads from stdin unless --plan-file is provided)
- --plan-file <path>: optional path to maker plan JSON file for --apply
- --debug: print diagnostics (selected tools, AWS CLI calls, prompt sizes)
- --agent-trace: print detailed coordinator/agent lifecycle logs (tool selection + investigation steps)

clanker ask "what's the status of my chat service lambda?"
clanker ask --profile dev "what's the last error from my big-api-service lambda?"
clanker ask --ai-profile openai "What are the latest logs for our dev Lambda functions?"
clanker ask --ai-profile cohere --cohere-model command-a-03-2025 "Summarize the current deployment risks in dev."
clanker ask --agent-trace --profile dev "how can i create an additional lambda and link it to dev?"
# Maker (plan + apply)
# Generate a plan (prints JSON)
clanker ask --aws --maker "create a small ec2 instance and a postgres rds" | cat
# Apply an approved plan from stdin
clanker ask --aws --maker --apply < plan.json | cat
# Apply an approved plan from a file
clanker ask --aws --maker --apply --plan-file plan.json | cat
# Allow destructive operations (only with explicit intent)
clanker ask --aws --maker --destroyer "delete the clanka-postgres rds instance" | cat
When you run with --maker --apply, the runner tries to be safe and repeatable.
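One way to keep that plan-then-apply loop safe on your side is to gate the apply step on the plan file at least parsing as JSON. This is a hypothetical wrapper (apply_plan is not a Clanker command; the real clanker invocation is commented out so the sketch runs standalone):

```shell
# Refuse to apply a maker plan file that is not valid JSON.
apply_plan() {
  local plan="$1"
  if ! python3 -m json.tool "$plan" >/dev/null 2>&1; then
    echo "invalid plan: $plan" >&2
    return 1
  fi
  echo "plan ok: $plan"
  # clanker ask --aws --maker --apply --plan-file "$plan" | cat
}

printf '{"steps": []}' > /tmp/plan.json   # stand-in file; the real schema comes from --maker
apply_plan /tmp/plan.json
```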
Clanker provides comprehensive Kubernetes cluster management and monitoring capabilities.
# Create an EKS cluster
clanker k8s create eks my-cluster --nodes 2 --node-type t3.small
clanker k8s create eks my-cluster --plan # Show plan only
# Create a kubeadm cluster on EC2
clanker k8s create kubeadm my-cluster --workers 2 --key-pair my-key
clanker k8s create kubeadm my-cluster --plan # Show plan only
# List clusters
clanker k8s list eks
clanker k8s list kubeadm
# Delete a cluster
clanker k8s delete eks my-cluster
clanker k8s delete kubeadm my-cluster
# Get kubeconfig for a cluster
clanker k8s kubeconfig eks my-cluster
clanker k8s kubeconfig kubeadm my-cluster
# Deploy a container image
clanker k8s deploy nginx --name my-nginx --port 80
clanker k8s deploy nginx --replicas 3 --namespace production
clanker k8s deploy nginx --plan # Show plan only
# Get all resources from a specific cluster (JSON output)
clanker k8s resources --cluster my-cluster
# Get resources in YAML format
clanker k8s resources --cluster my-cluster -o yaml
# Get resources from all EKS clusters
clanker k8s resources
# Get logs from a pod
clanker k8s logs my-pod
# Get logs from a specific container
clanker k8s logs my-pod -c my-container
# Follow logs in real-time
clanker k8s logs my-pod -f
# Get last N lines
clanker k8s logs my-pod --tail 100
# Get logs from a specific time period
clanker k8s logs my-pod --since 1h
# Get logs with timestamps
clanker k8s logs my-