Load Balancing Using Nginx for Scalable Web Application Deployment

Ashique Antony

Load balancing is crucial when scaling applications to handle more traffic. In my recent project, I configured Nginx as a reverse proxy load balancer to distribute traffic across multiple backend servers, improving performance, fault tolerance, and scalability.

Architecture Overview

  • Load Balancer (Nginx) – Acts as the front-facing server.

  • Backend Servers (Server 2, Server 3) – Serve the actual content (e.g., a welcome page or web application).

  • Round Robin Method – Distributes requests sequentially across servers.

  • (Nginx actually supports several load-balancing methods; we choose one based on the requirements. A sketch of the common alternatives follows this list.)
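
Round robin is the default; the other common methods only change a single directive inside the upstream block. A minimal sketch, with placeholder backend IPs:

upstream backend {
    # Round robin is used when no method directive is given
    # least_conn;   # send each request to the server with the fewest active connections
    # ip_hash;      # keep each client IP on the same backend (simple session stickiness)
    server 192.168.1.10;
    server 192.168.1.11;
}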

To see how this works in practice, let’s install a different web server on each backend (a quick start sketch for both follows the list). Here I use:

  1. Server 2 = httpd (default port 80)

yum install httpd -y

  2. Server 3 = tomcat (default port 8080)

yum install tomcat -y
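
After installing, a quick sketch of getting each backend up and serving an identifiable page (paths assume the stock yum packages; adjust if your layout differs):

# Server 2 – Apache httpd (default port 80)
sudo systemctl enable --now httpd
echo "Hello from Server 2 (httpd)" | sudo tee /var/www/html/index.html
curl http://localhost/

# Server 3 – Tomcat (default port 8080)
sudo systemctl enable --now tomcat
curl -I http://localhost:8080/    # Tomcat's default page (or your deployed app)

Giving each backend distinct content makes the round-robin rotation easy to see later.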

Tomcat listens on port 8080 by default, so on Server 3 we add a reverse proxy so that Tomcat is reachable directly on the server IP (port 80).

Install Nginx:

yum install nginx -y

Then add the following server block to an Nginx conf file.

Sample Config: /etc/nginx/conf.d/tomcat.conf

server {
    listen 80;
    server_name your-domain.com;  # or use IP if no domain

    location / {
        # Forward all requests to Tomcat running locally on port 8080
        proxy_pass http://localhost:8080;
        # Pass the original host and client details through to Tomcat
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

Reload Nginx

sudo nginx -t       # test config
sudo systemctl reload nginx
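
With the proxy in place, Tomcat’s page should now come back on port 80. A quick check (the address is a placeholder for Server 3’s IP):

curl -I http://<server-3-ip>/    # response now comes from Tomcat, proxied by Nginx on port 80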

Now let’s do the load balancing with Nginx.

On the load balancer, which acts as the front-facing server, install Nginx the same way and add an upstream block that lists the backend servers (for example in a file under /etc/nginx/conf.d/).

Nginx Configuration Example

upstream backend {
    # Backend servers; Nginx distributes requests round robin by default
    server 192.168.1.10;
    server 192.168.1.11;
    server 192.168.1.12;
}

server {
    listen 80;

    location / {
        # Forward every request to the upstream group defined above
        proxy_pass http://backend;
    }
}
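
If the backends are not identical, the same upstream block accepts per-server tuning parameters. A rough sketch with illustrative values:

upstream backend {
    server 192.168.1.10 weight=2;                      # receives roughly twice the share of requests
    server 192.168.1.11 max_fails=3 fail_timeout=30s;  # taken out of rotation for 30s after 3 failed attempts
    server 192.168.1.12 backup;                        # used only when the other servers are unavailable
}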

Restart Nginx:

sudo systemctl restart nginx

OUTPUT

Visit the load balancer in a browser: http://<IP>/

With every refresh, the load balancer sends the request to the next backend in the rotation, so the traffic is distributed across all the servers.
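
The rotation is also easy to watch from the command line, assuming each backend serves a distinguishable page (the address is a placeholder for the load balancer’s IP):

for i in $(seq 1 6); do curl -s http://<load-balancer-IP>/ | head -n 1; done
# the first line of each response alternates between the backends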
