Filebeat Integration with Spring Boot Application

In the preceding article, we explored Logstash's capability to ingest logs from a file. In this article, we will examine the end-to-end process for handling logs: the application stores logs locally in a file, Filebeat picks them up, and Logstash processes them further. As this topic builds on the prior article, I encourage you to review it before proceeding: https://shahsmit.hashnode.dev/logstash-x-spring-boot-integration
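
If your Spring Boot application does not already write its logs to a local file, a single property is enough to get started. Below is a minimal sketch using Spring Boot's logging.file.name property; the path is illustrative and should match the one Filebeat watches later in this article.

# application.properties - write the application's logs to a file (path is illustrative)
logging.file.name=/Users/Smit/Downloads/demo/myapp.log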

Please ensure you are using Elasticsearch, Logstash, and Filebeat version 8 or above.

We will be using the following components of the Elastic Stack:

Elasticsearch: A distributed search and analytics engine where the logs will ultimately be stored and queried. https://www.elastic.co/downloads/elasticsearch

Logstash: Designed to collect, parse, and transform diverse types of data from various sources. https://www.elastic.co/downloads/logstash

Filebeat: Monitors log files and forwards events to Logstash or Elasticsearch. https://www.elastic.co/downloads/beats/filebeat

Let's configure Filebeat to read the log file:

filebeat.inputs:
- type: filestream
  enabled: true
  paths:
  - /Users/Smit/Downloads/demo/myapp.log

output.logstash:
  hosts: ["localhost:5044"]

A bit of information about the fields used above:

  • filebeat.inputs: Defines the input source for Filebeat.

  • type: filestream: Specifies that we're monitoring a file.

  • paths: Lists the path to the log file you want to monitor.

  • output.logstash: Configures the output to send data to Logstash.

  • hosts: Specifies the Logstash host and port to connect to.
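
With the configuration above saved as filebeat.yml in the Filebeat installation directory, Filebeat can be started as shown in the sketch below (the -e flag writes Filebeat's own logs to stderr, and -c points to the configuration file):

# Start Filebeat from its installation directory
./filebeat -e -c filebeat.yml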

Now let's update the Logstash configuration to read from Filebeat and upload the data to Elasticsearch:

input {
  beats {
    port => 5044
  }
}

filter {
    grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:log-timestamp}%{SPACE}%{LOGLEVEL:log-level} %{NUMBER:log-process-id} --- \[%{DATA:log-tomcat-metadata}\] %{DATA:log-class-name} %{SPACE} : %{GREEDYDATA:log-message}" }
    }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "filebeat-myapp-%{+YYYY.MM.dd}"
  }
}

  • input: Defines the input source for Logstash.

  • beats: Specifies that Logstash will receive data from Beats (like Filebeat).

  • port: Sets the port to listen on for incoming Beats connections.

  • output: Configures the output to send data to Elasticsearch.

  • elasticsearch: Specifies the Elasticsearch output plugin.

  • hosts: Sets the Elasticsearch host and port to connect to.

  • index: Specifies the index pattern for storing the data in Elasticsearch.
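
Assuming the pipeline above is saved to a file such as logstash-filebeat.conf (the file name is just an example), Logstash can be started from its installation directory with the -f flag pointing to that file:

# Start Logstash with the pipeline configuration above
bin/logstash -f logstash-filebeat.conf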

Please pay attention to the grok pattern used above. Since it was explained in a previous article, it will not be elaborated on here for brevity.
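
As a rough illustration (this line is made up, not taken from the demo application), the pattern is written for standard Spring Boot console-style log lines such as:

2024-01-15T10:23:45.123  INFO 12345 --- [           main] com.example.demo.DemoApplication         : Started DemoApplication

Once Elasticsearch, Logstash, Filebeat, and the application are all running, a quick way to verify that data is flowing end to end is to list the matching indices (a simple check, assuming the defaults used above):

curl "http://localhost:9200/_cat/indices/filebeat-myapp-*?v"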

In conclusion, we have looked at a simple integration using Filebeat. Please feel free to connect with me on Twitter (now called 'X') or comment below with any issues or feedback.
