kubectl-mcp-server

by rohitg00

A Model Context Protocol server that lets AI assistants (Claude, Cursor, Windsurf, etc.) issue natural-language Kubernetes commands through kubectl/Helm.

Installation

Prerequisites
1. Kubernetes cluster access and a valid KUBECONFIG file
2. Python 3.9+
3. kubectl installed and pointing at the desired cluster
4. An LLM API key (e.g. OPENAI_API_KEY or ANTHROPIC_API_KEY) for the model provider the server calls
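
Before installing, the prerequisites above can be confirmed with a few standard commands (a generic check, not specific to this project):

$ kubectl version --client                             # kubectl present
$ kubectl cluster-info                                 # cluster reachable via the active KUBECONFIG
$ python3 --version                                    # needs 3.9+
$ test -n "$OPENAI_API_KEY" && echo "LLM key is set"   # or check ANTHROPIC_API_KEY instead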
Quick-start
# 1. Clone
$ git clone https://github.com/rohitg00/kubectl-mcp-server.git
$ cd kubectl-mcp-server

# 2. Create a virtual-env (recommended)
$ python -m venv .venv && source .venv/bin/activate

# 3. Install Python dependencies
$ pip install -r requirements.txt

# 4. Export credentials and cluster context
$ export KUBECONFIG=$HOME/.kube/config    # or path to your config
$ export OPENAI_API_KEY=<your-key>        # or ANTHROPIC_API_KEY, etc.

# 5. Start the MCP server
$ python -m kubectl_mcp_server --host 0.0.0.0 --port 8080

# 6. (Optional) Register as a kubectl plugin
$ ln -s $(pwd)/scripts/kubectl-ai /usr/local/bin/kubectl-ai
$ chmod +x /usr/local/bin/kubectl-ai

# 7. Chat with your cluster!
$ kubectl ai "Why is my deployment restarting?"
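
To drive the cluster from an MCP client such as Claude Desktop or Cursor rather than the kubectl plugin, register the server in that client's MCP configuration. The snippet below is only a sketch of the common `mcpServers` format: it assumes the server also speaks MCP's stdio transport when launched without `--host`/`--port`, and it uses the Claude Desktop config path on macOS; check this project's documentation and your client's docs for the exact entry.

# Sketch only — path and stdio launch are assumptions, not confirmed by this project
# cat > creates the file; if it already exists, merge the mcpServers entry by hand instead
$ cat > ~/Library/Application\ Support/Claude/claude_desktop_config.json <<'EOF'
{
  "mcpServers": {
    "kubectl": {
      "command": "python",
      "args": ["-m", "kubectl_mcp_server"]
    }
  }
}
EOF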
Docker deployment
$ docker build -t kubectl-mcp-server .
$ docker run -d -p 8080:8080 \
    -e KUBECONFIG=/kube/config \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -v $HOME/.kube/config:/kube/config \
    kubectl-mcp-server
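
Once the container is up, ordinary Docker commands confirm that the server started and is listening (nothing here is specific to this image):

$ docker ps --filter ancestor=kubectl-mcp-server                      # is the container running?
$ docker logs $(docker ps -q --filter ancestor=kubectl-mcp-server)    # server startup output
$ docker port $(docker ps -q --filter ancestor=kubectl-mcp-server)    # confirm 8080 is published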
A Helm chart and Kubernetes manifests for in-cluster deployment live in `deploy/`; edit `values.yaml` to supply your LLM key.
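
As a rough sketch of the in-cluster path — the chart location, release name, and value key below are placeholders, not the project's actual names; check `deploy/` and `values.yaml` before running anything:

# Hypothetical names — confirm the real chart path and value keys in deploy/ first
$ helm install kubectl-mcp ./deploy \
    --set llmApiKey=$OPENAI_API_KEY       # placeholder key name; see values.yaml
# Or apply the plain manifests directly after editing them:
$ kubectl apply -f deploy/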

License: MIT
Updated 7/30/2025
