kubectl-ai is an AI-powered Kubernetes agent that runs in your terminal.
First, ensure that kubectl is installed and configured.
- Download the latest release from the releases page for your target machine.
- Untar the binary, make it executable, and move it to a directory included in your `$PATH` (as shown below):
```bash
$ tar -zxvf kubectl-ai_Darwin_arm64.tar.gz
$ chmod a+x kubectl-ai
$ sudo mv kubectl-ai /usr/local/bin/
```
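To confirm the binary is now discoverable, you can check that it resolves on your `$PATH`:

```bash
# verify that the shell can find kubectl-ai on the PATH
which kubectl-ai
```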
Set your Gemini API key as an environment variable. If you don't have a key, get one from Google AI Studio.
```bash
export GEMINI_API_KEY=your_api_key_here
```
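If you want the key available in every new terminal session, you can also persist it in your shell profile (a minimal sketch assuming bash; use the equivalent file for zsh or other shells):

```bash
# optional: persist the API key across shell sessions
echo 'export GEMINI_API_KEY=your_api_key_here' >> ~/.bashrc
source ~/.bashrc
```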
Run interactively:
```bash
kubectl-ai
```
The interactive mode allows you to chat with `kubectl-ai`, asking multiple questions in sequence while maintaining context from previous interactions. Simply type your queries and press Enter to receive responses. To exit the interactive shell, type `exit` or press Ctrl+C.
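For example, a session might look like the following, where later questions rely on earlier answers (the queries are only illustrative):

```bash
kubectl-ai
# at the interactive prompt, ask follow-up questions that reuse context, e.g.:
#   list pods in the default namespace
#   which of those pods restarted most recently?
#   exit
```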
Or, run with a task as input:
kubectl-ai -quiet "fetch logs for nginx app in hello namespace"
Combine it with other Unix commands:
```bash
kubectl-ai < query.txt
# OR
echo "list pods in the default namespace" | kubectl-ai
```
You can even combine a positional argument with stdin input. The positional argument will be used as a prefix to the stdin content:
```bash
cat error.log | kubectl-ai "explain the error"
```
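The same pattern works for piping live cluster output into a question (a sketch, assuming you have kubectl access and a pod named `nginx` in the `hello` namespace):

```bash
# pipe a pod description to kubectl-ai and ask for a summary of it
kubectl describe pod nginx -n hello | kubectl-ai "summarize the state of this pod"
```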
You can use the following special keywords for specific actions:
- `model`: Display the currently selected model.
- `models`: List all available models.
- `version`: Display the `kubectl-ai` version.
- `reset`: Clear the conversational context.
- `clear`: Clear the terminal screen.
- `exit` or `quit`: Terminate the interactive shell (Ctrl+C also works).
Use it via the `kubectl` plugin interface like this: `kubectl ai`. kubectl will find `kubectl-ai` as long as it's in your PATH. For more information about plugins, please see: https://kubernetes.io/docs/tasks/extend-kubectl/kubectl-plugins/
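For example, you can confirm that kubectl discovers the plugin and then invoke it through the plugin mechanism (assuming `kubectl-ai` is already on your PATH):

```bash
# list the plugins kubectl can see, then run kubectl-ai via kubectl
kubectl plugin list
kubectl ai "list pods in the default namespace"
```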
```bash
# Get information about pods in the default namespace
kubectl-ai -quiet "show me all pods in the default namespace"

# Create a new deployment
kubectl-ai -quiet "create a deployment named nginx with 3 replicas using the nginx:latest image"

# Troubleshoot issues
kubectl-ai -quiet "double the capacity for the nginx app"
```
`kubectl-ai` will process your query, execute the appropriate kubectl commands, and provide you with the results and explanations.
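Because the `-quiet` form takes a single task as input, it also composes well inside shell scripts (a minimal sketch; the namespace names are placeholders):

```bash
# ask the same question across several namespaces from a script
for ns in default hello; do
  kubectl-ai -quiet "summarize unhealthy pods in the $ns namespace"
done
```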
The `kubectl-ai` project includes k8s-bench, a benchmark to evaluate the performance of different LLM models on Kubernetes-related tasks. Here is a summary from our last run:
| Model | Success | Fail |
|---|---|---|
| gemini-2.5-flash-preview-04-17 | 10 | 0 |
| gemini-2.5-pro-preview-03-25 | 10 | 0 |
| gemma-3-27b-it | 8 | 2 |
| Total | 28 | 2 |
See the full report for more details.
Note: This is not an officially supported Google product. This project is not eligible for the Google Open Source Software Vulnerability Rewards Program.