Export CloudWatch Logs to S3 in File Format

You may want to compile all log streams within a specific log group into a single file for analysis or debugging purposes.

First, you need to create a bucket in the same region as the CloudWatch Log Group.

aws s3api create-bucket --bucket app-logs --region us-west-2 --create-bucket-configuration LocationConstraint=us-west-2
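If you want to confirm the bucket really ended up in the right region, a quick check like the one below should do it. This is purely optional; nothing in the export depends on it.

aws s3api get-bucket-location --bucket app-logs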

Next, update the bucket policy so that the CloudWatch Logs service can write export data to the bucket. Here is the policy document:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": "s3:GetBucketAcl",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::app-logs",
      "Principal": {
        "Service": "logs.us-west-2.amazonaws.com"
      }
    },
    {
      "Action": "s3:PutObject",
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::app-logs/*",
      "Principal": {
        "Service": "logs.us-west-2.amazonaws.com"
      }
    }
  ]
}

Save the policy document as policy.json, then apply it to the bucket with the following command.

aws s3api put-bucket-policy --bucket app-logs --policy file://policy.json
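To double-check that the policy took effect, you can read it back. This is just a sanity check and is not required for the export itself.

aws s3api get-bucket-policy --bucket app-logs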

Next, initiate an export task that transfers all log streams from the log group into the S3 bucket you just created. You also need to specify the time range as Unix timestamps in milliseconds.
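If you would rather not work out the millisecond values by hand, a small sketch like this with GNU date should produce them. The window below is one full UTC day and is only illustrative; on macOS you may need gdate from coreutils.

# Start and end of the export window, in milliseconds since the Unix epoch
FROM=$(date -u -d "2024-01-01 00:00:00" +%s)000
TO=$(date -u -d "2024-01-02 00:00:00" +%s)000
echo "$FROM $TO"

You can then pass --from "$FROM" --to "$TO" instead of the literal values shown in the command below.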

aws logs create-export-task --task-name "app-logs-group-1" \
    --log-group-name "prod/app-logs" \
    --from 1704045600000 --to 1704132000000 \
    --destination "app-logs" --destination-prefix "prefix1"

The command above will produce a task ID. You can query the task ID to check whether the export job has been completed.

aws logs describe-export-tasks --task-id d6f1d52c-2783-4145-9668-4f5cc5579f41
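If you only want the status rather than the full response, a --query expression along these lines should narrow the output down to a single value (the task ID is the same placeholder as above):

aws logs describe-export-tasks \
    --task-id d6f1d52c-2783-4145-9668-4f5cc5579f41 \
    --query "exportTasks[0].status.code" --output text

A value of COMPLETED means the exported objects are ready in the bucket.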

Once the export task completes, you can download the bucket contents to your local machine and analyze them.

aws s3 sync s3://app-logs ./logs
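The exported objects are gzip-compressed, so if the goal is a single plain-text file, something like the following should stitch them together (all.log is just an example name):

# Decompress every exported object and concatenate into one file
find ./logs -name "*.gz" -exec zcat {} + > all.log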