Inputting JSON in Filebeat

Smit Shah
3 min read

Filebeat is a lightweight log shipper from the Elastic Stack used to collect and forward logs from various sources. It does not decode JSON log lines by default, but you can configure it to parse them and send the structured events on for further processing, or simply view them on the console.

Here's how to achieve this:

1. Enabling JSON Decoding:

Filebeat can decode single-line JSON objects present in your log files. Activate this functionality within the filebeat.yml configuration file under the specific input section for your logs. Here's an example:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input-specific configurations.

# filestream is an input for collecting log messages from files.
- type: filestream

  # Unique ID among all inputs, an ID is required.
  id: my-filestream-id

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /Users/smitshah/Documents/filebeat-8.13.4-darwin-x86_64/logs/*.log
    #- c:\programdata\elasticsearch\logs\*

  processors:
    - decode_json_fields:
        fields: ["message"]
        process_array: false
        max_depth: 1
        target: ""
        overwrite_keys: false
        add_error_key: true

In the configuration above, the field "message" holds the raw JSON log line, so we instruct the decode_json_fields processor to decode that field and add the resulting keys to the event.
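Alternatively, if every line in the file is already a standalone JSON object, the filestream input's built-in ndjson parser can decode it at read time instead of using a processor. A minimal sketch, reusing the id and path from the example above:

filebeat.inputs:
- type: filestream
  id: my-filestream-id
  enabled: true
  paths:
    - /Users/smitshah/Documents/filebeat-8.13.4-darwin-x86_64/logs/*.log
  parsers:
    - ndjson:
        # "" merges the decoded keys into the root of the event
        target: ""
        # record a decoding error on the event instead of failing silently
        add_error_key: true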

2. Specifying Decoded Data Placement (Optional):

Where the decoded keys end up depends on which settings you use:

  • target: With the decode_json_fields processor, the decoded object replaces the original string field by default. Setting target: "" (as in the example above) merges the keys into the root of the event, while a non-empty value nests them under that field; see the sketch after this list.

  • overwrite_keys: When enabled, values from the decoded JSON overwrite conflicting fields that Filebeat normally adds (like type or source).

  • keys_under_root: This option belongs to the json.* settings of the older log input, where decoded data is placed under a "json" key by default; setting it to true places the keys directly at the root of the output document instead.
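For illustration, here is a minimal sketch of the processor with a non-empty target. The name "app" is purely a placeholder for this example, not something the demo uses:

  processors:
    - decode_json_fields:
        fields: ["message"]
        process_array: false
        max_depth: 1
        # hypothetical target: decoded keys are nested under an "app" object
        # instead of being merged into the root of the event
        target: "app"
        # let decoded values overwrite existing fields on conflict
        overwrite_keys: true
        add_error_key: true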

3. Sending Decoded Data to Console (Not Recommended for Production):

While Filebeat typically sends data to Elasticsearch or Logstash for further processing, you can temporarily route the decoded JSON to the console to verify your configuration. This is meant for debugging only and is not recommended for production. Keep in mind that Filebeat supports only one output at a time, so enable the console output only while the Elasticsearch or Logstash output is disabled.

output:
  console:
    enabled: true
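If you want the events to be easier to read while testing, the console output can also pretty-print each event:

output:
  console:
    enabled: true
    # pretty-print each event as multi-line JSON, handy when inspecting decoded fields
    pretty: true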

4. Starting Filebeat:

Once you've configured filebeat.yml, save the changes and restart Filebeat using the appropriate command for your system (e.g., sudo systemctl restart filebeat on a Linux package install), or run the binary directly if you are using an extracted archive, as shown below.
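For a tarball install like the one used in this demo, a minimal way to run Filebeat in the foreground (the path is taken from the example above; adjust it to your setup) is:

cd /Users/smitshah/Documents/filebeat-8.13.4-darwin-x86_64
# -e logs to stderr, -c points at the configuration file to use
./filebeat -e -c filebeat.yml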

Let's now provide some logs. Here is a sample of the data the demo was run on:

{"app_timestamp": "2024-06-02T17:34:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-02T17:12:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-02T17:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-01T17:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-20T17:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-11T17:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-20T16:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-11T15:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-11T14:11:00Z", "app_message": "This is a sample log message", "app_level": "info"}
{"app_timestamp": "2024-06-11T14:15:00Z", "app_message": "Awesome cutting fields", "app_level": "debug"}
{"app_timestamp": "2024-06-11T14:16:00Z", "app_message": "Awesome cutting 2 fields", "app_level": "debug"}
{"app_timestamp": "2024-06-11T14:16:00Z", "app_message": "Awesome cutting 4 fields", "app_level": "debug"}
{"app_timestamp": "2024-06-11T15:16:00Z", "app_message": "Awesome cutting 5 fields", "app_level": "debug"}

The lines above are processed, and Filebeat outputs events like the one below:

{
  "@timestamp": "2024-06-02T09:53:46.167Z",
  ...
  "app_message": "Awesome cutting 4 fields",
  "app_level": "debug",
  "message": "{\"app_timestamp\": \"2024-06-11T14:16:00Z\", \"app_message\": \"Awesome cutting 4 fields\", \"app_level\": \"debug\"}",
  "app_timestamp": "2024-06-11T14:16:00Z"
}

We are able to successfully process the input message, and the decoded JSON fields now appear at the root of the event.
