Setting Up a Multi-Node Kubernetes Cluster with Kind: Step-by-Step Manual


Kubernetes (K8s) is the go-to orchestration tool for managing containerized applications at scale. For developers and learners, testing Kubernetes configurations locally is a key step in mastering it. With Kind (Kubernetes IN Docker), you can run Kubernetes clusters in Docker containers locally. This guide will show you how to set up a multi-node Kubernetes cluster using Kind.
Prerequisites
Before we begin, ensure you have the following tools installed on your machine:
Docker: Install Docker Desktop from https://www.docker.com/
kubectl: the Kubernetes CLI used to manage the cluster. Install it using:
On Windows (PowerShell):
choco install kubernetes-cli
On macOS:
brew install kubectl
On Linux (Debian/Ubuntu, after adding the Kubernetes apt repository described in the official docs):
sudo apt-get install -y kubectl
Kind: Install Kind using the following command:
go install sigs.k8s.io/kind@latest
Alternatively, on Windows (via Chocolatey):
choco install kind
Verify the installation:
kind version
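If you installed Kind with go install, the binary is placed in $(go env GOPATH)/bin, so make sure that directory is on your PATH (Linux/macOS example below). It is also worth confirming the kubectl client at this point:
export PATH="$(go env GOPATH)/bin:$PATH"
kubectl version --client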
Step 1: Create a Kind Configuration File
To create a multi-node cluster, we need to define a configuration file that specifies the roles and number of nodes. Here's an example configuration:
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
- role: control-plane
- role: worker
- role: worker
Save this file as kind-cluster-config.yaml.
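If you need the cluster to run a specific Kubernetes version, Kind also accepts an image per node. The tag below is only an illustration; pick a kindest/node image that matches your Kind release:
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
- role: control-plane
  image: kindest/node:v1.27.3   # pin every node to the same Kubernetes version
- role: worker
  image: kindest/node:v1.27.3
- role: worker
  image: kindest/node:v1.27.3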
Step 2: Create the Cluster
Run the following command to create a Kubernetes cluster with the configuration:
kind create cluster --config kind-cluster-config.yaml
This will:
Create a control-plane node.
Add two worker nodes to the cluster.
To check the cluster's status, use:
kubectl cluster-info
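Kind also sets your current kubectl context to kind-<cluster-name> (kind-kind here, since we did not pass --name to kind create cluster), so you can target the cluster explicitly:
kubectl cluster-info --context kind-kind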
Step 3: Verify the Nodes
Once the cluster is up, verify the nodes by running:
kubectl get nodes
You should see:
One control-plane node.
Two worker nodes.
Example output:
NAME                 STATUS   ROLES           AGE   VERSION
kind-control-plane   Ready    control-plane   2m    v1.27.0
kind-worker          Ready    <none>          1m    v1.27.0
kind-worker2         Ready    <none>          1m    v1.27.0
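The exact ages and versions will differ on your machine. For extra detail such as internal IPs and the container runtime, widen the output:
kubectl get nodes -o wide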
Step 4: Deploy a Sample Application
Let’s deploy a sample application to validate the cluster. For this example, we'll create an Nginx Deployment:
- Create a deployment file named nginx-deployment.yaml:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nginx-deployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: nginx
  template:
    metadata:
      labels:
        app: nginx
    spec:
      containers:
      - name: nginx
        image: nginx:latest
        ports:
        - containerPort: 80
- Apply the deployment:
kubectl apply -f nginx-deployment.yaml
- Check the pods:
kubectl get pods
You should see three nginx pods in the Running state.
- Expose the deployment as a service:
kubectl expose deployment nginx-deployment --type=NodePort --port=80
- Get the service details:
kubectl get svc
Use the NodePort from the output to access the Nginx application; see the note below if it is not reachable from your browser.
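Because the Kind nodes are Docker containers, the NodePort may not be reachable from your host browser unless the cluster was created with extraPortMappings in its configuration. A simple alternative that works everywhere is port-forwarding; this sketch forwards local port 8080 (an arbitrary choice) to the service created above:
kubectl port-forward svc/nginx-deployment 8080:80
Then open http://localhost:8080 in your browser.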
Step 5: Clean Up the Cluster
To delete the cluster after testing, run:
kind delete cluster
This removes all resources and stops the containers used by Kind.
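If you created the cluster with a custom name, pass the same name when deleting it (my-cluster below is just a placeholder):
kind delete cluster --name my-cluster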
Tips and Troubleshooting
Docker Issues:
Ensure Docker is running before creating the cluster.
Restart Docker if the cluster creation fails.
Cluster Not Accessible:
Check that the kubeconfig file is correctly set.
Use kubectl config view to inspect the kubeconfig, or switch back to the Kind context as shown below.
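Kind names its kubeconfig context kind-<cluster-name>, so if another tool has switched your context, you can list the contexts and switch back (kind-kind assumes the default cluster name):
kubectl config get-contexts
kubectl config use-context kind-kind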
Adding More Nodes:
To add more worker nodes, add extra "- role: worker" entries to the Kind configuration file, then recreate the cluster (see the example below).
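For example, the following configuration creates one control-plane node and four workers. Kind applies the file only at creation time, so delete the old cluster and create a new one after editing it:
kind: Cluster
apiVersion: kind.x-k8s.io/v1alpha4
nodes:
- role: control-plane
- role: worker   # one entry per worker node
- role: worker
- role: worker
- role: worker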
Conclusion
Setting up a multi-node Kubernetes cluster locally with Kind is an excellent way to test and learn Kubernetes. It allows you to simulate real-world scenarios on your machine without needing a cloud environment.
Use this guide to experiment with different Kubernetes features, deploy applications, and gain hands-on experience in a controlled setup. Happy learning!