Streamlining GCP Audit Log Management with Centralized Logging for SIEM Integration

Elias Santoro

Managing logs across multiple Google Cloud Platform (GCP) projects can be a daunting task, especially when it comes to ensuring security and efficiency. One effective strategy is to centralize audit logs from the various projects into a single project, where they can be safely accessed and analyzed by the SIEM tool of your choice. In this post, I'll share how I achieved this using Google Cloud's logging sinks, unique writer identities, and retention and bucket lock policies, all orchestrated and deployed through Terraform.

Why Centralize Logs?

Centralizing logs provides several benefits:

  1. Simplified Access: Instead of configuring your SIEM tool to connect to multiple projects, it only needs to connect to one.

  2. Enhanced Security: Reducing the number of connections your SIEM needs to make lowers the attack surface.

  3. Improved Management: Easier to implement and manage retention policies, access controls, and other logging configurations.

Setting Up Centralized Logging

  1. Creating Storage Buckets in the Centralized Logging Project

First, create a dedicated GCP project to serve as the centralized logging repository. This project will host the storage buckets where logs from other projects will be stored.

resource "google_storage_bucket" "centralized_logging_bucket" {
  name     = "centralized-logging-bucket"
  location = "US"

  lifecycle_rule {
    action {
      type = "Delete"
    }
    condition {
      age = 365
    }
  }

  retention_policy {
    retention_period = 31536000
    is_locked        = true
  }
}
⚠️
The bucket lock feature is irreversible: once a retention policy is locked, it cannot be removed and the retention period cannot be reduced. Review your retention and compliance requirements thoroughly, and be confident in the retention period you set, before applying a bucket lock.
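One way to de-risk this is to stage the lock behind a Terraform variable: apply the policy unlocked first, verify the retention behaviour, and only then flip the lock in a follow-up apply. A minimal sketch (the lock_retention_policy variable is my own naming, not part of the provider):

# Hypothetical variable to stage the lock: apply with the default (false)
# first, verify retention behaviour, then set it to true in a later apply.
variable "lock_retention_policy" {
  description = "Lock the bucket retention policy (irreversible once true)"
  type        = bool
  default     = false
}

# Then, in the bucket resource above, reference the variable instead of a literal:
#   retention_policy {
#     retention_period = 31536000
#     is_locked        = var.lock_retention_policy
#   }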
  2. Configure Logging Sinks in Each Source Project

For each GCP project that needs to send logs to the centralized project, configure a logging sink. The sinks should be set up in the source projects and point to the storage bucket in the centralized project as the destination.

A logging sink in GCP is a configuration that routes log entries to a specified destination. It has three main components:

  • Destination: Specifies where the logs should be sent, such as a Cloud Storage bucket, BigQuery dataset, or Pub/Sub topic.

  • Filter: Determines which log entries are routed by the sink based on criteria like log names or resource types.

  • Unique Writer Identity: A unique service account created for the sink that has write permissions to the destination, enhancing security by limiting access.

resource "google_logging_project_sink" "my_project_sink" {
  count         = length(var.project_ids)
  name          = "centralized-logging-sink"
  project       = var.project_ids[count.index]
  destination   = "storage.googleapis.com/${google_storage_bucket.centralized_logging_bucket.name}"
  filter        = "logName:\"logs/cloudaudit.googleapis.com\""
  unique_writer_identity = true
}
💡
You can tune and modify the filter component of the logging sink to match your specific needs, ensuring that only the relevant log entries are routed to your chosen destination.
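For instance, if you only want Admin Activity audit logs, or want to further restrict by severity, the filter can be narrowed along these lines (the log IDs below follow GCP's standard audit log naming):

# Admin Activity audit logs only
filter = "logName:\"cloudaudit.googleapis.com%2Factivity\""

# Data Access audit logs only
filter = "logName:\"cloudaudit.googleapis.com%2Fdata_access\""

# Admin Activity audit logs at WARNING severity or above
filter = "logName:\"cloudaudit.googleapis.com%2Factivity\" AND severity>=WARNING"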
  3. Add IAM Bindings for Unique Writer Identities

If your GCP projects are managed across different Terraform workspaces, this step ensures that each logging sink has write access to the centralized storage bucket. If all projects are within the same Terraform workspace, you can add the IAM binding when creating the storage bucket.

The binding must use the sink's unique writer identity, which takes the format: serviceAccount:service-{PROJECT_NUMBER}@gcp-sa-logging.iam.gserviceaccount.com

where {PROJECT_NUMBER} is the numeric project number GCP assigns to the source project (not its project ID).

resource "google_storage_bucket_iam_member" "logging_sink_writer" {
  count    = length(var.project_ids)
  bucket   = google_storage_bucket.centralized_logging_bucket.name
  role     = "roles/storage.objectCreator"
  member   = serviceAccount:service-${gcp-source-project-id}@gcp-sa-logging.iam.gserviceaccount.com
}
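If everything lives in one workspace instead, a simpler option is to bind the writer_identity attribute that Terraform exposes on each sink, which avoids reconstructing the service agent address by hand. A sketch:

resource "google_storage_bucket_iam_member" "logging_sink_writer" {
  count  = length(var.project_ids)
  bucket = google_storage_bucket.centralized_logging_bucket.name
  role   = "roles/storage.objectCreator"
  # writer_identity is already returned in "serviceAccount:..." form
  member = google_logging_project_sink.my_project_sink[count.index].writer_identity
}

Across separate workspaces, you can achieve the same thing by exporting writer_identity as an output from each source workspace and passing it into the logging workspace.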

Now you should start seeing logs from the different source projects being written to the bucket(s) in your centralized logging project.

  4. Configure the SIEM Tool

The configuration of the SIEM tool depends on the specific tool you are using. Follow the tool's documentation to authenticate with GCP and set up log collection from the centralized bucket. Typically, you'll need to add read IAM bindings on the storage bucket so the SIEM tool can access and read the logs.
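For example, assuming your SIEM authenticates with a dedicated service account (siem-collector below is a hypothetical name), the read binding might look like:

resource "google_storage_bucket_iam_member" "siem_reader" {
  bucket = google_storage_bucket.centralized_logging_bucket.name
  role   = "roles/storage.objectViewer"
  # Hypothetical service account used by the SIEM collector
  member = "serviceAccount:siem-collector@my-logging-project.iam.gserviceaccount.com"
}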

In conclusion, centralizing your GCP audit logs into a single project offers numerous advantages, including simplified access, enhanced security, and easier management. By using Google Cloud’s logging sinks with unique writer identities and implementing proper retention and bucket lock policies, you can efficiently and securely aggregate logs from multiple projects. Terraform simplifies this entire process, making it scalable and repeatable. Once logs are centralized, integrating them with your SIEM tool ensures robust monitoring and threat detection across your cloud environment. This approach not only strengthens your security posture but also streamlines your log management efforts, making it a best practice for organizations managing multiple GCP projects.
