Talk2 K8s
What is Talk2 K8s
talk2-k8s is an AI-powered interface that lets users interact with Kubernetes clusters using natural language, leveraging the capabilities of large language models (LLMs) through a Kubernetes MCP Server.
Use cases
Use cases for talk2-k8s include simplifying Kubernetes operations for developers, automating deployment processes, and enhancing user experience in managing cloud resources.
How to use
To use talk2-k8s, set up a Kubernetes cluster on Google Cloud, install the necessary tools (the gcloud CLI, kubectl, and kubectl-ai), create a Gemini API key, and run the kubectl-ai command in your terminal to start interacting with your cluster.
Key features
Key features of talk2-k8s include natural language processing for Kubernetes commands, integration with Gemini AI models, and the ability to validate user interactions through challenges.
Where to use
talk2-k8s can be used in cloud computing environments, DevOps practices, and any scenario where Kubernetes management and automation are required.
Content
Build with AI with Gemini: Let’s talk to your cluster - an LLM agent-powered Kubernetes MCP Server
Resources
Prerequisites
- [gcloud CLI](https://cloud.google.com/sdk/docs/install) installed
- kubectl installed
- A Kubernetes cluster up and running
- kubectl-ai installed
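As a quick sanity check before continuing, a small shell snippet (a sketch; the tool names are taken from the prerequisites list above) can confirm the required CLIs are on your PATH:

```shell
# check_tool: report whether a CLI is available on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1"
  else
    echo "missing: $1"
  fi
}

# CLIs named in the prerequisites above.
for t in gcloud kubectl kubectl-ai; do
  check_tool "$t"
done
```

Anything reported as missing should be installed before moving on to the setup steps.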
Setup
- Create a Kubernetes cluster on Google Cloud
- Create a Gemini API key at [Google AI Studio](https://aistudio.google.com/apikey)
- Export the API key as an environment variable:
export GEMINI_API_KEY=<API_KEY>
- Open a terminal and run one of the following commands to start kubectl-ai:
kubectl-ai --model gemini-2.5-pro-exp-03-25
or
kubectl-ai --model gemini-2.5-flash-preview-04-17
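Since kubectl-ai cannot reach Gemini without the key, a small wrapper (a sketch; `start_kubectl_ai` is a hypothetical helper, not part of the tooling) can guard the launch:

```shell
# start_kubectl_ai: refuse to launch if the Gemini API key is not exported.
start_kubectl_ai() {
  if [ -z "${GEMINI_API_KEY:-}" ]; then
    echo "GEMINI_API_KEY is not set; create a key in Google AI Studio first" >&2
    return 1
  fi
  # Model name from the setup steps above; swap in the flash model if preferred.
  kubectl-ai --model gemini-2.5-pro-exp-03-25
}
```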
Additional Tips
Add a Kubernetes MCP server (Optional)
code --add-mcp '{"name":"kubernetes","command":"npx","args":["kubernetes-mcp-server@latest"]}'
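For MCP clients that read a JSON configuration file rather than accepting the `code --add-mcp` one-liner, the same server entry would look roughly like this (a sketch; the exact file name and top-level key vary by client):

```json
{
  "servers": {
    "kubernetes": {
      "command": "npx",
      "args": ["kubernetes-mcp-server@latest"]
    }
  }
}
```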
Talk to Cluster Challenge
- Run the following command to apply the challenge manifests:
kubectl apply -f https://raw.githubusercontent.com/chamodshehanka/talk2-k8s/refs/heads/main/bwai-manifests.yaml
- Complete the challenge using kubectl-ai
- Validate your work by running the following commands:
chmod +x ./validate.sh
./validate.sh
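Before running the validator, you can confirm that the challenge resources from the manifest actually exist in your cluster (a sketch; it relies only on `kubectl get -f`, which looks up each resource defined in a manifest):

```shell
# URL of the challenge manifests from the step above.
MANIFEST_URL="https://raw.githubusercontent.com/chamodshehanka/talk2-k8s/refs/heads/main/bwai-manifests.yaml"

# kubectl get -f succeeds only if every resource in the manifest can be found.
if kubectl get -f "$MANIFEST_URL" >/dev/null 2>&1; then
  status="present"
else
  status="missing"
fi
echo "challenge resources: $status"
```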
Submission Link: https://forms.gle/FdQaVMad1MszTzkM8