Monitoring a Java App with the ELK Stack on AWS

Nishank Koul

Introduction

Monitoring and managing logs in any modern software system is a vital task, especially when dealing with distributed microservices and real-time applications. One of the most powerful and widely adopted solutions for centralized logging and visualization is the ELK Stack, which comprises Elasticsearch, Logstash, and Kibana.

In this guide, we'll walk you through a complete, hands-on scenario of monitoring a Java application using the ELK Stack on AWS EC2 instances. We will also leverage Filebeat to ship logs from the application server to the ELK stack server.

🔍 What is the ELK Stack?

The ELK Stack is an open-source suite of tools developed by Elastic for log aggregation, analysis, and visualization. It enables users to gain real-time insights into their systems, troubleshoot issues quickly, and make data-driven decisions.

🧱 Components of the ELK Stack:

  1. Elasticsearch
    A highly scalable, distributed search and analytics engine that stores and indexes data.

  2. Logstash
    A powerful data collection and processing engine that ingests data from various sources, transforms it, and ships it to a destination like Elasticsearch.

  3. Kibana
    A web-based visualization tool for Elasticsearch. It provides dashboards, graphs, charts, and other tools for log analysis and monitoring.

🔄 How They Work Together:

  • Logstash collects and processes data from logs or other sources.

  • The data is indexed and stored in Elasticsearch.

  • Kibana connects to Elasticsearch and visualizes this data using beautiful, interactive dashboards.

🌟 Use Cases:

  • Real-time application monitoring

  • Centralized logging

  • Security analytics

  • Business intelligence

  • Infrastructure monitoring

🧪 Project Scenario: Monitoring a Java Application

Let’s walk through a practical scenario where we have a Java application running on a virtual machine (VM), and we want to monitor its logs using the ELK Stack hosted on another VM.

📁 Step 1: Setting Up the Java Application

Imagine you have a Java application that you run using:

java -jar app.jar

This command launches your application and logs information such as error messages, request traces, and system events. By default, these logs are printed to the console in a raw, unstructured format, making them hard to interpret or analyze.

To make this more manageable:

nohup java -jar app.jar > app.log &

  • nohup: Keeps the process running even after the terminal session ends (it ignores the hangup signal).

  • > app.log: Redirects the console output into app.log.

  • &: Runs the process in the background.
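
To confirm the application is actually writing to the file, you can tail it from another terminal:

tail -f app.log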

📤 Step 2: Introducing Filebeat for Log Forwarding

To transfer the raw logs from the application VM to the ELK Stack VM, we use Filebeat, a lightweight log shipper by Elastic.

🔧 Filebeat Configuration:

Filebeat reads the app.log file and forwards the data to Logstash, which is listening on port 5044 on the ELK Stack VM.

This keeps the application server lightweight: Filebeat simply tails the file and ships new lines, while the heavier parsing happens in Logstash on the ELK VM.

🏗️ Step 3: Setting Up the ELK Stack on a Separate VM

To build a robust monitoring pipeline, we deploy the ELK Stack on a separate virtual machine. In this guide, we'll use AWS EC2 instances for both the Java application and the ELK Stack.

🔓 Open Required Ports:

In each instance's security group, allow inbound traffic on the ports the stack uses:

  • ELK VM: 9200 (Elasticsearch), 5601 (Kibana), and 5044 (Logstash Beats input), plus 22 for SSH.

  • App VM: 8080 (the Java application) and 22 for SSH.
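
If you manage security groups from the CLI instead of the AWS console, a sketch like the following opens the three ELK-side ports (the group ID sg-0123456789abcdef0 is a placeholder for your instance's security group):

# Elasticsearch (9200), Kibana (5601), Logstash Beats input (5044)
SG_ID=sg-0123456789abcdef0
for PORT in 9200 5601 5044; do
  aws ec2 authorize-security-group-ingress \
    --group-id "$SG_ID" \
    --protocol tcp \
    --port "$PORT" \
    --cidr 0.0.0.0/0   # fine for a demo; tighten this CIDR for anything real
done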

☁️ Installing ELK Stack Components

✅ 1. Install Java

Logstash needs a Java runtime (Elasticsearch 7.x ships with its own bundled JDK, but installing OpenJDK keeps the setup uniform).

sudo apt update
sudo apt install openjdk-17-jre-headless -y
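
You can confirm the runtime is available with:

java -version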

✅ 2. Install Elasticsearch

Add the Elastic APT repository and install Elasticsearch:

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list
sudo apt update
sudo apt install elasticsearch -y

Configure Elasticsearch:

Edit the configuration file:

sudo vim /etc/elasticsearch/elasticsearch.yml

Add the following:

network.host: 0.0.0.0
cluster.name: my-cluster
node.name: node-1
discovery.type: single-node

Start and enable the service:

sudo systemctl start elasticsearch
sudo systemctl enable elasticsearch

Visit http://<ELK-VM-IP>:9200 to confirm Elasticsearch is running.
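
Alternatively, from the ELK VM itself, a quick curl shows the node summary; with the settings above, the response should include the cluster name:

curl http://localhost:9200
# Expect a JSON document containing "cluster_name" : "my-cluster"
# along with node name and version details.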

Either way, a JSON response listing your cluster details confirms Elasticsearch is running successfully.

✅ 3. Install and Configure Logstash

sudo apt install logstash -y

Create Logstash Configuration:

sudo vim /etc/logstash/conf.d/logstash.conf

Paste:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{LOGLEVEL:log_level} %{GREEDYDATA:log_message}" }
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "logs-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}
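
For reference, here is how that grok pattern splits a typical line (the log line below is made up for illustration):

# Sample input line:
#   2024-05-01T10:15:30.123 ERROR Failed to connect to database
#
# Fields extracted by the grok filter:
#   log_timestamp => "2024-05-01T10:15:30.123"
#   log_level     => "ERROR"
#   log_message   => "Failed to connect to database"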

Start and enable Logstash:

sudo systemctl start logstash
sudo systemctl enable logstash
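
You can also have Logstash validate the pipeline configuration; on a package install, --path.settings must point at /etc/logstash when invoking the binary directly:

sudo /usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash \
  --config.test_and_exit \
  -f /etc/logstash/conf.d/logstash.conf
# Look for "Configuration OK" near the end of the output.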

✅ 4. Install and Configure Kibana

sudo apt install kibana -y

Edit the configuration:

sudo vim /etc/kibana/kibana.yml

Add:

server.host: "0.0.0.0"
elasticsearch.hosts: ["http://localhost:9200"]

Start and enable Kibana:

sudo systemctl start kibana
sudo systemctl enable kibana

Access the UI at: http://<ELK-VM-IP>:5601
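
If the UI doesn't come up, the service logs are the first place to look:

sudo systemctl status kibana
sudo journalctl -u kibana --no-pager -n 50   # last 50 log lines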

🛠️ Setting Up the Application Server and Filebeat

Now that our ELK Stack is fully installed and operational on one EC2 instance, the next step is to configure our Java application server (let’s call this the app-server) and ensure that the logs it generates are correctly forwarded to Logstash using Filebeat.

✅ 5. SSH into the App Server and Set Up the Application

Begin by connecting to your app-server EC2 instance via SSH. Once connected, execute the following commands to prepare the environment:

Update and Install Required Packages:

sudo apt update && sudo apt upgrade -y

Clone the Java Application Code:

git clone https://github.com/nishankkoul/Boardgame.git

Install Java and Maven:

Both are required to compile and run the Java application:

sudo apt install openjdk-17-jre-headless -y
sudo apt install maven -y
mvn --version

✅ 6. Install Filebeat on the App Server

Filebeat will monitor and forward logs from the application to Logstash running on the ELK server.

wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo mkdir -p /etc/apt/sources.list.d
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee /etc/apt/sources.list.d/elastic-7.x.list
sudo apt update
sudo apt install filebeat -y

✅ 7. Build and Run the Java Application with Logging

Navigate to the project directory, build the application using Maven, and redirect logs to a file named app.log.

cd Boardgame/
mvn package
cd target/
nohup java -jar database_service_project-0.0.7.jar > /home/ubuntu/Boardgame/target/app.log 2>&1 &

  • nohup: Keeps the process running after the SSH session ends.

  • 2>&1: Redirects stderr into stdout, so errors land in app.log alongside regular output.

  • The application is now up and running, writing its logs to /home/ubuntu/Boardgame/target/app.log.

You can access the running app by visiting:

http://<app-server-ec2-ip>:8080
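
From the app server itself, a quick curl works too (assuming the app listens on port 8080, as above):

curl -I http://localhost:8080
# An HTTP status line (e.g. HTTP/1.1 200) confirms the app is serving requests.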

✅ 8. Configure Filebeat to Send Logs to Logstash

Next, we need to modify the Filebeat configuration file so that it watches the app.log file and forwards logs to the Logstash endpoint on the ELK stack server.

Open the Filebeat Configuration File:

sudo vim /etc/filebeat/filebeat.yml

In the filebeat.inputs Section, Configure the Log Input:

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /home/ubuntu/Boardgame/target/app.log

❌ Remove the default output.elasticsearch section entirely.

✅ Add the Logstash Output Configuration:

output.logstash:
  hosts: ["<elk-stack-server-ip>:5044"]
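
Before starting the service, you can have Filebeat validate the edited file:

sudo filebeat test config
# Expected output: Config OK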

Start and Enable Filebeat:

sudo systemctl start filebeat
sudo systemctl enable filebeat
sudo systemctl status filebeat

Test the Filebeat Output:

To verify that Filebeat is correctly configured and can reach Logstash, run:

sudo filebeat test output
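
On a healthy setup, the output ends with a successful connection check; exact lines vary slightly by version, but it looks roughly like:

logstash: <elk-stack-server-ip>:5044...
  connection...
    parse host... OK
    dns lookup... OK
    dial up... OK
  talk to server... OK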

If the final check reports talk to server... OK, logs are now being shipped from your application to the ELK Stack!

✅ 9. Visualizing Logs in Kibana

With the data now flowing into Elasticsearch via Filebeat and Logstash, it's time to check everything is working by exploring the logs in Kibana.

Open Kibana in the Browser

Navigate to:

http://<elk-stack-server-ip>:5601

Once the dashboard loads, you should see a message saying:

"You have data in Elasticsearch"

Create an Index Pattern

  • Click on "Create index pattern".

  • In the index name field, enter:

      logs*

    This matches the daily logs-YYYY.MM.dd indices created by the Logstash output configured earlier.

  • For the time filter field, choose @timestamp, which Logstash adds to every event; if you don't need time-based filtering, you can skip it.

  • Click on Create index pattern.

Explore Logs with Discover:

  • Click the hamburger menu (≡) on the top-left corner.

  • Go to "Discover".

  • Filter on the source file that Filebeat records for each event, for example:

      log.file.path : *app.log*

You’ll now see live log entries flowing from your Java application!

🎉 Project Complete!

Congratulations! You've successfully:

✅ Set up a Java application with redirected log output
✅ Installed and configured Filebeat to collect and ship logs
✅ Parsed and indexed logs using Logstash and Elasticsearch
✅ Visualized and searched those logs in Kibana

You now have a working end-to-end log-monitoring pipeline for your Java application, built entirely with open-source tools!
