n8n Cloud Automation Made Easy — Stop Repeating, Start Automating! [Part 1]

Kanak Vyas
10 min read

Tired of manually running scripts to track unused cloud resources and rushing to update Google Sheets on time? We’ve all been there. Now, imagine a system that automatically fetches resources — like available EBS volumes — updates your Google Sheet on schedule, and even sends you a clean email summary.

Sounds like a dream?
With n8n’s automation magic, you can set this up once and let it run on autopilot — weekly, monthly, or whenever you need. 🚀

This is Part 1 of our automation journey — keeping the flow simple and focused. We’ll track unused AWS resources across all regions, update a Google Sheet, and send out an email. No complex setups, just clean, efficient automation. Let’s dive in!

About n8n

n8n (pronounced either en-eight-en or nodemation) is an automation tool that lets you create custom workflows. Think of it as a digital assistant that handles repetitive work for you by scheduling tasks, integrating services, or automatically moving data between different apps that don’t normally communicate with each other out of the box.

🛠️ Prerequisites

Before we start, make sure you’ve got:

  • A public EC2 instance (a t2.medium should be enough)

  • Docker installed on the EC2 instance

  • An AWS instance profile with permissions to read resource data (see the sanity check below)

  • Google OAuth2 credentials, with the Google Sheets API and Gmail API enabled in Google Cloud

  • A Google Sheet with column names set up for tracking data

  • A Python script to fetch AWS data (like available EBS volumes)

  • A GitLab repo to store the Python scripts
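
Before building the workflow, it’s worth sanity-checking that the instance profile can actually read what the automation needs. A minimal boto3 sketch (assuming the profile is attached to the EC2 machine and boto3 is installed locally):

import boto3

# Quick permission check: who am I, and can I list regions and volumes?
session = boto3.Session()
print("Account:", session.client('sts').get_caller_identity()['Account'])

ec2 = session.client('ec2', region_name='us-east-1')
print(len(ec2.describe_regions()['Regions']), "regions visible")

available = ec2.describe_volumes(
    Filters=[{'Name': 'status', 'Values': ['available']}]
)['Volumes']
print(len(available), "available volumes in us-east-1")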

🐳 n8n Installation (self-hosted, Docker)

We’ll start by running n8n in Docker and ensuring it has the right dependencies. Since we’re using Python scripts with the AWS SDK (boto3), let’s bake that into the Docker image. Dockerfile:

FROM docker.n8n.io/n8nio/n8n:latest

# Install Python and dependencies
USER root
RUN apk add --no-cache \
    python3 \
    py3-pip \
    py3-virtualenv && \
    python3 -m venv /home/node/venv && \
    /home/node/venv/bin/pip install --no-cache-dir \
    boto3 \
    google-api-python-client

# Ensure virtual environment Python and pip are accessible
ENV PATH="/home/node/venv/bin:$PATH"

# Switch back to node user for n8n
USER node

Build and run it with the commands below. Use this setup for local development and testing only; it isn’t safe for production.

docker volume create n8n_data

docker build -t custom-n8n-python .

docker run --rm --name n8n -d \
  -p 5678:5678 \
  -v n8n_data:/home/node/.n8n \
  -v /home/ec2-user/local-scripts:/home/node/scripts \
  -e GENERIC_TIMEZONE="Asia/Kolkata" \
  -e TZ="Asia/Kolkata" \
  -e N8N_SECURE_COOKIE=false \
  custom-n8n-python start --tunnel

For production, put n8n behind a proper reverse proxy (like Nginx) with SSL instead of the tunnel, and set N8N_SECURE_COOKIE to true.

The commands above create a volume for persistent data, build the custom n8n image, and start your container, exposed on port 5678. Mounting the n8n_data volume persists your work between container restarts.

Once running, you can access n8n by opening: http://<EC2 Public IP>:5678
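
To confirm the Python dependencies were baked into the image, a quick check (hypothetical one-liner) is: docker exec n8n python3 -c "import boto3". A silent exit means boto3 is available inside the container.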

🧠 Building the n8n Workflow

Here’s the final workflow I’ve built:

💡
Nodes are the key building blocks of a workflow.

Let’s break it down step by step:

  1. Schedule Trigger: Use the Schedule Trigger node to run workflows at fixed intervals and times. This works in a similar way to the Cron software utility in Unix-like systems.
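    For example, a weekly run every Monday at 9 AM corresponds to the cron expression 0 9 * * 1 if you use the node’s custom cron option (times follow the GENERIC_TIMEZONE we set on the container).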

  2. GitLab Node — We fetch our audit Python script directly from GitLab. This script checks for available AWS EBS volumes.

    Let’s break down this n8n GitLab node setup! 🛠️
    Parameters:

    • Credential to connect with: GitLab account — Authenticates n8n with the right GitLab account. I’ve used an API access token to authenticate with GitLab.

    • Resource: File — We’re working with a file in the repo.

    • Operation: Get — Just fetching, no changes — safe and simple.

    • Project Owner: groupA — The GitLab group that owns the project.

    • Project: n8n-scripts-poc — The specific repo we’re pulling from.

    • File Path: fetch_ebs_details.py — The Python script we need.

    • As Binary Property: Enabled — n8n treats the file contents as binary data instead of a text string. Binary format is often used for files like images, PDFs, or scripts. Why it matters: Binary mode prevents issues with encoding and formatting — especially when downloading executable files like Python scripts. It preserves the file exactly as it is.

    • Output Field: data — This sets the name of the output field where the binary file data will be stored in n8n’s workflow. The next nodes will reference this field to access the file.

    • Reference: main — The branch from which the file should be fetched.
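
    Under the hood, this node is roughly equivalent to the GitLab raw-file API call below (a sketch assuming a gitlab.com-hosted repo and a hypothetical access token; the node handles authentication and binary handling for you):

    import urllib.request

    # Project path is URL-encoded ("groupA/n8n-scripts-poc"); ref selects the branch
    url = ("https://gitlab.com/api/v4/projects/groupA%2Fn8n-scripts-poc"
           "/repository/files/fetch_ebs_details.py/raw?ref=main")
    req = urllib.request.Request(url, headers={"PRIVATE-TOKEN": "<your-token>"})
    print(urllib.request.urlopen(req).read().decode())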

  3. Read/Write Files from Disk Node — We save the pulled Python script to a temp directory:

    • Operation: Write File to Disk — Stores the Python script in a temp directory.

    • File Path: /tmp/fetch_ebs_details.py

    • Input Binary Field: data — References the previous node’s output field to access the file.

  4. Execute Command Node — Runs the script inside the container. Instead of using n8n’s Python (Beta) node (which lacks certain libraries like boto3), we execute the script directly:
    Parameters:

    • Execute Once: Enabled

Command:

    /home/node/venv/bin/python3 /tmp/fetch_ebs_details.py
  5. Code Node — Transform the output
    We convert the script’s output into an array of dictionaries, so it’s ready for Google Sheets.
    Setting:

    • Always Output Data — Enabled

Parameters:

  • Mode: Run Once for All Items — Runs the operation just once, processing all incoming items in one go.

  • Language: JavaScript — The script for transforming and parsing the data is written in JavaScript.

      const output = $json["stdout"]
          .replace(/^Checking.*?\n\[/s, "[") // Remove the initial non-JSON text
          .replace(/\n/g, "") // Remove line breaks
          .trim();
    
      try {
          const parsed = JSON.parse(output);
          return parsed.map(item => ({ json: item }));
      } catch (error) {
          throw new Error("Failed to parse JSON: " + error.message);
      }
    
  6. Google Sheets Node — Update the existing audit sheet
    Parameters:

    • Credential to connect with: Google Sheets account — Uses an authenticated Google Sheets account to access and modify the sheet.
      For setup, refer to the “Connecting Gmail & Google Sheets via Google Cloud API” section below.

    • Resource: Sheet Within Document — Specifies that this node will interact with a specific sheet inside a Google Sheets document.

    • Operation: Append or Update Row — If a row with a matching key exists, it updates it; if not, it appends a new row.

    • Document: n8n-automation-audit — The Google Sheets document where data will be written.

    • Sheet: available_ebs_volumn — The specific tab within the document where data will be added or updated.

    • Mapping Column Mode: Map Each Column Manually — Allows manual mapping of the data fields to specific columns in the sheet.

    • Column to match on: Volume_ID — Uses Volume_ID as the key for matching existing rows. If the Volume_ID already exists, the row gets updated—otherwise, a new row is added.

Values to Send:

  • Volume_ID (using to match): {{ $json.Volume_ID }}

  • Account_ID: {{ $json.Account_ID }}

  • Volume_Name: {{ $json.Volume_Name }}

  • Size: {{ $json.Size }}

  • Snapshot_ID: {{ $json.Snapshot_ID }}

  • Created_Date: {{ $json.Created_Date }}

  • Region: {{ $json.Region }}

  • Tags: {{ $json.Tags }}

Node Setting:

  • Always Output Data: Enabled
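
To make the Append or Update behavior concrete, here’s a minimal Python sketch of the same keyed upsert the node performs for us (illustrative only, not the node’s actual implementation):

def append_or_update(rows, item, key="Volume_ID"):
    """Update the row whose key matches; otherwise append a new row."""
    for i, row in enumerate(rows):
        if row.get(key) == item.get(key):
            rows[i] = {**row, **item}  # matching key: update in place
            return rows
    rows.append(item)  # no match: append as a new row
    return rows

# A re-run with the same Volume_ID updates the row instead of duplicating it
sheet = [{"Volume_ID": "vol-0abc", "Size": 8}]
append_or_update(sheet, {"Volume_ID": "vol-0abc", "Size": 16})
print(sheet)  # [{'Volume_ID': 'vol-0abc', 'Size': 16}]
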
  7. Gmail Node — Send the results
    Parameters:

Body:

    <!DOCTYPE html>
    <html>
      <head>
        <style>
          body {
            font-family: Arial, sans-serif;
            line-height: 1.6;
            color: #333333;
            background-color: #f9f9f9;
            margin: 0;
            padding: 0;
            display: flex;
            justify-content: center;
            align-items: center;
            min-height: 100vh;
          }
          .container {
            background: #ffffff;
            border-radius: 10px;
            padding: 30px;
            max-width: 600px;
            box-shadow: 0 2px 8px rgba(0, 0, 0, 0.1);
          }
          h3 {
            color: #333333; /* Changed to black */
            font-size: 18px;
            margin-bottom: 20px;
          }
          p {
            font-size: 16px;
            margin: 10px 0;
            text-align: left; /* Left-aligned text */
          }
          a {
            color: #007bff;
            text-decoration: none;
            font-weight: bold;
          }
          a:hover {
            text-decoration: underline;
          }
          .footer {
            margin-top: 30px;
            font-size: 14px;
            color: #666666;
            text-align: left; /* Left-aligned footer */
          }
        </style>
      </head>
      <body>
        <div class="container">
          <h3>Hi Team,</h3>
          <p>
            The Google Sheet has been successfully updated for Available EBS Resources  
            <strong>{{ $("Google Sheets").params.documentId.cachedResultName }}</strong>  
            and Tab:  
            <strong>{{ $("Google Sheets").params.sheetName.cachedResultName }}</strong>.  
            You can access it using the link below:
          </p>
          <p>
            <a href="{{ $("Google Sheets").params.sheetName.cachedResultUrl }}" target="_blank">
              Open the Google Sheet
            </a>
          </p>
          <p>Let me know if you run into any issues.</p>
          <div class="footer">
            Best regards,<br>
            The DevOps Team
          </div>
        </div>
      </body>
    </html>

Setting:

  • Always Output Data - Enabled

  • Execute Once - Enabled

  8. Execute Command Node — (Optional) Cleans up any temporary files.
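
    For example, the cleanup command could be as simple as rm -f /tmp/fetch_ebs_details.py, matching the temp path written earlier (adjust if you used a different path).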

Result

  • Google Sheet — updated with the latest available EBS volumes

  • Email — a clean summary delivered via Gmail


📝 The Python Script

This is the Python script that fetches available EBS volumes across regions:

import boto3
import json

def get_all_regions(session):
    ec2_client = session.client('ec2', region_name='us-east-1')  # Any default region works for this call
    response = ec2_client.describe_regions()
    return [region['RegionName'] for region in response['Regions']]

def get_available_ebs_volumes(session):
    ebs_volumes_list = []
    regions = get_all_regions(session)
    # Look up the account ID once instead of once per volume
    account_id = session.client('sts').get_caller_identity()['Account']

    for region in regions:
        ec2_client = session.client('ec2', region_name=region)
        # Paginate so accounts with many volumes aren't silently truncated
        paginator = ec2_client.get_paginator('describe_volumes')
        pages = paginator.paginate(Filters=[{'Name': 'status', 'Values': ['available']}])

        for page in pages:
            for volume in page['Volumes']:
                volume_name = next((tag['Value'] for tag in volume.get('Tags', []) if tag['Key'] == 'Name'), 'N/A')
                tags = [f"{tag['Key']}={tag['Value']}" for tag in volume.get('Tags', [])]

                ebs_volumes_list.append({
                    "Account_ID": account_id,
                    "Volume_Name": volume_name,
                    "Volume_ID": volume['VolumeId'],
                    "Size": volume['Size'],
                    "Snapshot_ID": volume.get('SnapshotId', 'N/A'),
                    "Created_Date": volume['CreateTime'].strftime('%Y-%m-%d %H:%M:%S'),
                    "Region": region,
                    "Tags": ', '.join(tags)
                })

    return ebs_volumes_list

if __name__ == "__main__":
    session = boto3.Session()
    ebs_volumes_list = get_available_ebs_volumes(session)
    # Output structured JSON for the n8n Code node to parse
    print(json.dumps(ebs_volumes_list, indent=2))
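
For a single available volume, the printed JSON looks roughly like this (values are illustrative); the keys are exactly what the Code node passes through and the Google Sheets node maps to columns:

[
  {
    "Account_ID": "123456789012",
    "Volume_Name": "old-app-data",
    "Volume_ID": "vol-0abc123def4567890",
    "Size": 100,
    "Snapshot_ID": "snap-0123456789abcdef0",
    "Created_Date": "2024-05-01 10:15:00",
    "Region": "ap-south-1",
    "Tags": "Name=old-app-data, env=dev"
  }
]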

🌐 Connecting Gmail & Google Sheets via Google Cloud API

1. Set up Google Cloud Project: Go to Google Cloud Console and create a new project.

2. Enable APIs: Enable Gmail API and Google Sheets API from the "APIs & Services" section.

3. Configure your OAuth consent screen: If you haven't used OAuth in your Google Cloud project before, you'll need to configure the OAuth consent screen.
For User Type, select Internal for user access within your organization's Google Workspace, or External for any user with a Google account. Then enter the essential information:

  • App name

  • User support email

  • Email addresses field in Developer contact information

  • In the Authorized domains section, add n8n.cloud

  4. Create your Google OAuth client credentials

    Next, create the OAuth client credentials in Google:

    • In the APIs & Services section, select Credentials.

    • Select + CREATE CREDENTIALS > OAuth client ID.

    • In the Application type dropdown, select Web application.

    • Google automatically generates a Name. Update the Name to something you'll recognize in your console.

    • From your n8n credential, copy the OAuth Redirect URL. Paste it into the Authorized redirect URIs in Google Console.

    • Select CREATE.

  5. Data access: Allow your project to access specific types of private user data from their Google Account. Choose the necessary scopes for Gmail and Google Sheets API access.

  6. Finish your n8n credential: With the Google project and credentials fully configured, finish the n8n credential:

    • From Google's OAuth client created modal, copy the Client ID. Enter this in your n8n credential.

    • From the same Google modal, copy the Client Secret. Enter this in your n8n credential.

    • In n8n, select Sign in with Google to complete your Google authentication.

    • Save your new credentials.
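
n8n handles the OAuth flow for you from here. If you ever want to verify Sheets API access from Python outside n8n (the Dockerfile already installs google-api-python-client), a service account is usually the simpler route; this is a hypothetical sketch, assuming a key file sa.json and a sheet shared with that service account:

from google.oauth2 import service_account
from googleapiclient.discovery import build

# Hypothetical key file and spreadsheet ID; share the sheet with the
# service account's email before running this
creds = service_account.Credentials.from_service_account_file(
    "sa.json",
    scopes=["https://www.googleapis.com/auth/spreadsheets.readonly"])
service = build("sheets", "v4", credentials=creds)

# Read the header row of the audit tab to confirm access
result = service.spreadsheets().values().get(
    spreadsheetId="<your-spreadsheet-id>",
    range="available_ebs_volumn!A1:H1").execute()
print(result.get("values", []))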


⏳ Final words

Once this is set up, schedule the workflow to run weekly (or whenever you want) in n8n. No more manual script runs, no more forgotten updates — just clean, automated AWS resource tracking.

So go ahead, set this up, and let n8n do the heavy lifting. You’ve got better things to do! 💪

And we have arrived at the end of Part 1. I hope you found it insightful; I’d be happy to read about your experiences in the comment section 👍

Got questions or ideas to tweak this for your own use case? Let me know in the comments! 🚀

Writing runs on coffee. Help keep the words flowing by buying me a coffee.
