Centralizing VPC Flow Logs from AWS Accounts in an Organization Managed by Control Tower to a Single S3 Bucket in Log Archive Account

Lajah Shrestha

VPC Flow Logs are a crucial part of monitoring. They capture information about the IP traffic going to and from network interfaces within a Virtual Private Cloud (VPC), providing detailed insights into network activity, such as identifying unusual traffic patterns, diagnosing connectivity issues, and monitoring for security threats. By analyzing VPC Flow Logs, organizations gain valuable data for troubleshooting and improving network performance. These logs can also be integrated into monitoring and analytics tools like Splunk or the ELK Stack (Elasticsearch, Logstash, and Kibana), enabling centralized log management, real-time visualization, and advanced threat detection across network traffic.

In this blog we’ll look at how to collect VPC flow logs from multiple accounts into a single S3 bucket. This is useful in an AWS Organization with multiple AWS accounts, where a single S3 bucket can serve as the source of logs for building visual dashboards.

Prerequisites (not mandatory):

  1. AWS Organization setup

  2. Log Archive account set up by AWS Control Tower

  3. VPCs

Steps to follow:

  1. Create a central S3 bucket (sink) in the Log Archive account

  2. Update the destination bucket policy

  3. Create a VPC

  4. Enable VPC flow logs & select the log format

  5. Choose the central bucket as the destination

Create a central S3 bucket (sink) in the Log Archive account

When an AWS organization is managed with AWS Control Tower, it creates a Log Archive account that serves the purpose of storing different kinds of logs from across the organization. We’ll centralize the flow logs into an S3 bucket we create ourselves inside this account.

  • Create an S3 bucket from the Management Console.

  • After creating the S3 bucket, it must be allowed to receive log files written to it from multiple sources. In our use case, the sources are the VPCs of all the AWS accounts across the organization.
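If you prefer to script this step, the sketch below builds the parameters for an S3 CreateBucket call (e.g. `boto3.client("s3").create_bucket(**params)` run in the Log Archive account). The bucket name and region here are assumptions, not values from this article.

```python
# Hypothetical sketch: build the kwargs for an S3 CreateBucket call.
# Bucket name and region below are example assumptions.
def create_bucket_params(bucket_name, region):
    params = {"Bucket": bucket_name}
    # Outside us-east-1, S3 requires an explicit location constraint.
    if region != "us-east-1":
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

# In the Log Archive account you would then run, for example:
#   boto3.client("s3", region_name="ap-south-1").create_bucket(**params)
params = create_bucket_params("central-flow-logs", "ap-south-1")
print(params)
```

Note the us-east-1 special case: passing a `LocationConstraint` of `us-east-1` is rejected by S3, so the configuration block is only added for other regions.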

Bucket Policy for Centralized Flow log

Update the bucket policy as follows:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AWSLogDeliveryWrite",
            "Effect": "Allow",
            "Principal": {
                "Service": "delivery.logs.amazonaws.com"
            },
            "Action": "s3:PutObject",
            "Resource": [
                "arn:aws:s3:::DESTINATION_BUCKET_NAME",
                "arn:aws:s3:::DESTINATION_BUCKET_NAME/*"
            ],
            "Condition": {
                "StringEquals": {
                    "s3:x-amz-acl": "bucket-owner-full-control",
                    "aws:SourceAccount": [
                        "SOURCE_ACCOUNT_NUMBER_1",
                        "SOURCE_ACCOUNT_NUMBER_2",
                        "SOURCE_ACCOUNT_NUMBER_3"
                    ]
                },
                "ArnLike": {
                    "aws:SourceArn": [
                        "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_1:*",
                        "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_2:*",
                        "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_3:*"
                    ]
                }
            }
        },
        {
            "Sid": "AWSLogDeliveryCheck",
            "Effect": "Allow",
            "Principal": {
                "Service": "delivery.logs.amazonaws.com"
            },
            "Action": [
                "s3:GetBucketAcl",
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::DESTINATION_BUCKET_NAME",
            "Condition": {
                "StringEquals": {
                    "aws:SourceAccount": [
                        "SOURCE_ACCOUNT_NUMBER_1",
                        "SOURCE_ACCOUNT_NUMBER_2",
                        "SOURCE_ACCOUNT_NUMBER_3"
                    ]
                },
                "ArnLike": {
                    "aws:SourceArn": [
                        "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_1:*",
                        "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_2:*",
                        "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_3:*"
                    ]
                }
            }
        }
    ]
}

Here DESTINATION_BUCKET_NAME is the name of the S3 bucket we just created above. You may add more AWS accounts as needed in the same format, replacing the placeholders like "SOURCE_ACCOUNT_NUMBER_1" and "arn:aws:logs:REGION:SOURCE_ACCOUNT_NUMBER_1:*".
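With many source accounts, editing those placeholders by hand gets tedious. A small helper like the following (a sketch; the bucket name, region, and account IDs are example assumptions) can generate the same policy for any account list:

```python
import json

# Builds the centralized flow-log bucket policy shown above for any number
# of source accounts. Values passed in below are example assumptions.
def build_flow_log_bucket_policy(bucket_name, region, account_ids):
    source_arns = [f"arn:aws:logs:{region}:{acct}:*" for acct in account_ids]
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AWSLogDeliveryWrite",
                "Effect": "Allow",
                "Principal": {"Service": "delivery.logs.amazonaws.com"},
                "Action": "s3:PutObject",
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",
                    f"arn:aws:s3:::{bucket_name}/*",
                ],
                "Condition": {
                    "StringEquals": {
                        "s3:x-amz-acl": "bucket-owner-full-control",
                        "aws:SourceAccount": account_ids,
                    },
                    "ArnLike": {"aws:SourceArn": source_arns},
                },
            },
            {
                "Sid": "AWSLogDeliveryCheck",
                "Effect": "Allow",
                "Principal": {"Service": "delivery.logs.amazonaws.com"},
                "Action": ["s3:GetBucketAcl", "s3:ListBucket"],
                "Resource": f"arn:aws:s3:::{bucket_name}",
                "Condition": {
                    "StringEquals": {"aws:SourceAccount": account_ids},
                    "ArnLike": {"aws:SourceArn": source_arns},
                },
            },
        ],
    }

policy = build_flow_log_bucket_policy(
    "central-flow-logs", "us-east-1", ["111111111111", "222222222222"]
)
print(json.dumps(policy, indent=4))
```

The printed JSON can be pasted directly into the bucket's Permissions tab, or applied with `put_bucket_policy` via boto3 or the CLI.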

Enable VPC flow logs & select the log format

Create a VPC if one does not already exist. To enable a VPC flow log, select the desired VPC, navigate to the Flow logs tab, and select Create flow log.

Provide a name, select “Send to an Amazon S3 bucket” as the destination, and enter the ARN of the sink bucket created in the earlier step. Choose the desired log format and select Create.
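The same console steps can be automated in each source account via the EC2 CreateFlowLogs API. The sketch below only builds the request parameters (the VPC ID and bucket ARN are example assumptions); the actual call would be `boto3.client("ec2").create_flow_logs(**params)`.

```python
# Hypothetical sketch: kwargs for an EC2 CreateFlowLogs call that sends a
# VPC's flow logs to the central sink bucket. VPC ID and bucket ARN are
# example assumptions; run the real call in each source account:
#   boto3.client("ec2").create_flow_logs(**params)
def flow_log_params(vpc_id, sink_bucket_arn):
    return {
        "ResourceType": "VPC",
        "ResourceIds": [vpc_id],
        "TrafficType": "ALL",               # capture accepted and rejected traffic
        "LogDestinationType": "s3",
        "LogDestination": sink_bucket_arn,  # the central bucket in Log Archive
        # The default log format is used unless LogFormat="..." is supplied.
        "MaxAggregationInterval": 600,      # seconds
    }

params = flow_log_params("vpc-0123456789abcdef0", "arn:aws:s3:::central-flow-logs")
print(params["LogDestination"])
```

Because the destination is an S3 ARN in another account, no IAM delivery role is needed here; the bucket policy from the previous section is what authorizes the delivery.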

After some time, you will see log files appearing in the S3 bucket.

The logs are stored in the following format: BUCKET_NAME/AWSLogs/ACCOUNT_NUMBER/vpcflowlogs/REGION/YEAR/MONTH/DAY/LOG_FILE_NAME.log.gz
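That key layout is convenient for downstream tooling, because the source account, region, and date can be read straight out of the object key. A minimal sketch (the example key below is made up):

```python
# Minimal sketch: pull the account, region, and date out of a delivered
# flow-log object key following the layout described above.
def parse_flow_log_key(key):
    # AWSLogs/ACCOUNT/vpcflowlogs/REGION/YYYY/MM/DD/file.log.gz
    parts = key.split("/")
    return {
        "account": parts[1],
        "region": parts[3],
        "date": "-".join(parts[4:7]),
        "file": parts[7],
    }

info = parse_flow_log_key(
    "AWSLogs/111111111111/vpcflowlogs/us-east-1/2024/05/01/"
    "111111111111_vpcflowlogs_us-east-1_fl-0abc_20240501T0000Z_a1b2.log.gz"
)
print(info["account"], info["region"], info["date"])
```

A log-shipping or dashboarding pipeline can use this to tag each record with its source account before indexing.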

You can follow the same steps to enable VPC flow logs in all the required accounts, and the logs will be centralized in one single bucket across the whole organization.

Reference:

https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/configure-vpc-flow-logs-for-centralization-across-aws-accounts.html
