Automating Docker Volume Backups to Azure Blob Storage with a Cron Job

Table of contents
- Introduction
- Step 1: Create an Azure Blob Storage Account and Container
- Step 2: Generate a Shared Access Signature (SAS) Token
- Install AzCopy (if not installed)
- Step 3: Create a Shell Script to Back Up Docker Volume
- Step 4: Automate the Backup with a Cron Job
- Step 5: Set Up Lifecycle Management to Delete Old Backups
- Conclusion
Introduction
When running applications in Docker, persistent data stored in Docker volumes is critical. To ensure data safety, regular backups are necessary. This guide walks you through the process of:
- Creating an Azure Storage Account and a Blob Container.
- Generating a Shared Access Signature (SAS) Token for secure uploads.
- Writing a Bash script to back up Docker volumes.
- Automating backups with a cron job that runs daily at 11 AM.
Step 1: Create an Azure Blob Storage Account and Container
1.1 Create an Azure Storage Account
1. Go to the Azure Portal.
2. Search for Storage Accounts in the search bar and click Create.
3. Fill in the required details:
   - Subscription: Choose your Azure subscription.
   - Resource group: Create a new one or use an existing one.
   - Storage account name: Provide a unique name (e.g., `mydockerbackups`).
   - Region: Select a region close to your deployment.
   - Performance: Standard.
   - Redundancy: Locally-redundant storage (LRS) is sufficient.
4. Click Review + Create and wait for the deployment to complete.
1.2 Create a Blob Storage Container
1. Navigate to the Storage Account you just created.
2. Click Containers > + Container.
3. Name it `terrabackup`.
4. Set the public access level to Private (no anonymous access).
5. Click Create.
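If you prefer scripting over clicking through the portal, the same resources can be created with the Azure CLI. This is a hedged sketch: the resource group, account name, and region below are placeholders, it assumes you are logged in via `az login`, and it skips the calls entirely if `az` is not installed.

```shell
#!/bin/bash
# Placeholder names - adjust the resource group, account name, and region.
RG="docker-backups-rg"
ACCOUNT="mydockerbackups"
LOCATION="eastus"
CONTAINER="terrabackup"

if command -v az >/dev/null 2>&1; then
    # Resource group, storage account (LRS), and private container
    az group create --name "$RG" --location "$LOCATION"
    az storage account create --name "$ACCOUNT" --resource-group "$RG" \
        --location "$LOCATION" --sku Standard_LRS
    az storage container create --name "$CONTAINER" \
        --account-name "$ACCOUNT" --public-access off
    RESULT="resources created"
else
    RESULT="skipped (Azure CLI not installed)"
fi
echo "$RESULT"
```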
Step 2: Generate a Shared Access Signature (SAS) Token
1. In your Storage Account, go to Security + networking > Shared access signature.
2. Select the following permissions:
   - Allowed services: Blob
   - Allowed resource types: Container, Object
   - Allowed permissions: Read, Write, Delete, List
3. Set the Start and Expiry dates (e.g., valid for 1 year).
4. Click Generate SAS and connection string.
5. Copy the Blob SAS URL (we’ll use it in the backup script).
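For clarity on what you just copied: the Blob SAS URL is simply the container URL with the SAS token appended as a query string, and the backup script in Step 3 keeps those two parts in separate variables. A minimal sketch with made-up placeholder values:

```shell
#!/bin/bash
# Hypothetical placeholder values - substitute the ones from your portal.
AZURE_STORAGE_URL="https://mydockerbackups.blob.core.windows.net/terrabackup"
SAS_TOKEN="sv=2024-01-01&ss=b&srt=co&sp=rwdl&sig=EXAMPLE"
TIMESTAMP="01-01-2025_11-00-00"

# Destination for one backup object: container URL + blob name + SAS query string
DEST="${AZURE_STORAGE_URL}/${TIMESTAMP}.tar.gz?${SAS_TOKEN}"
echo "$DEST"
```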
Install AzCopy (if not installed)
Before running the backup script, you need AzCopy, a command-line tool used to transfer data to and from Azure Blob Storage.
Install AzCopy on Linux
Run the following commands in your terminal:
```bash
# Download AzCopy
wget https://aka.ms/downloadazcopy-v10-linux

# Expand the archive
tar -xvf downloadazcopy-v10-linux

# (Optional) Remove any existing AzCopy version
sudo rm -f /usr/bin/azcopy

# Copy AzCopy to a directory on your PATH
sudo cp ./azcopy_linux_amd64_*/azcopy /usr/bin/

# Verify the installation
azcopy --version
```
If the installation is successful, you should see the AzCopy version displayed.
Step 3: Create a Shell Script to Back Up Docker Volume
3.1 Write the Backup Script
Create a new shell script file:
```bash
nano /home/azureuser/docker_volume_backup.sh
```
Add the following content:
```bash
#!/bin/bash

# Variables
VOLUME_NAME="volume_name"
AZURE_STORAGE_URL="https://serverbackuplogs.blob.core.windows.net/terrabackup"
SAS_TOKEN="your-token"
TIMESTAMP=$(date +"%d-%m-%Y_%H-%M-%S")

# Ensure azcopy is installed
if ! command -v azcopy &> /dev/null; then
    echo "🚧 AzCopy is not installed. Please install it first. 🚧"
    exit 1
fi

# Mount the volume in a throwaway container and stream the archive straight to Azure
echo "⌛ Copying Docker volume $VOLUME_NAME directly to Azure Blob Storage... ⌛"
if docker run --rm -v "$VOLUME_NAME":/volume alpine tar -czf - -C /volume . | \
    azcopy copy "${AZURE_STORAGE_URL}/${TIMESTAMP}.tar.gz?${SAS_TOKEN}" --from-to=PipeBlob; then
    echo "Backup successfully uploaded to Azure Blob Storage as $TIMESTAMP.tar.gz."
else
    echo "Upload failed!"
    exit 1
fi

echo "✅ Backup process completed. ✅"
```
Explanation of the Script
- Define variables: `VOLUME_NAME` specifies the Docker volume to back up; `AZURE_STORAGE_URL` is the URL of the Azure Blob Storage container; `SAS_TOKEN` is the Shared Access Signature used for authentication; `TIMESTAMP` adds a timestamp to the backup file name.
- Check that AzCopy is installed: the script checks whether `azcopy` is available and, if not, exits with an error message.
- Backup process: the script runs an Alpine Linux container, mounts the specified Docker volume, compresses its contents (`tar -czf`), and streams the archive directly to Azure Blob Storage using `azcopy`.
- Upload verification: if the upload succeeds, a success message is displayed; otherwise, the script exits with an error.
3.2 Give the Script Execute Permission
```bash
chmod +x /home/azureuser/docker_volume_backup.sh
```
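A backup is only useful if you can restore it. The reverse of the backup pipeline (download the archive with `azcopy`, then unpack it into a volume) can be sketched as below. This is an untested outline: `BACKUP_NAME` and `restore_volume` are hypothetical placeholders, and the download and unpack steps are skipped when `azcopy` or `docker` is unavailable.

```shell
#!/bin/bash
# Hypothetical values - same container URL and token as the backup script.
AZURE_STORAGE_URL="https://serverbackuplogs.blob.core.windows.net/terrabackup"
SAS_TOKEN="your-token"
BACKUP_NAME="01-01-2025_11-00-00.tar.gz"   # timestamped blob to restore
TARGET_VOLUME="restore_volume"             # volume to restore into

if command -v azcopy >/dev/null 2>&1 && command -v docker >/dev/null 2>&1; then
    # Download the archive, then extract it into the target Docker volume
    azcopy copy "${AZURE_STORAGE_URL}/${BACKUP_NAME}?${SAS_TOKEN}" "/tmp/${BACKUP_NAME}"
    docker run --rm -v "$TARGET_VOLUME":/volume -v /tmp:/backup alpine \
        tar -xzf "/backup/${BACKUP_NAME}" -C /volume
    RESTORE_STATUS="restore attempted into $TARGET_VOLUME"
else
    RESTORE_STATUS="skipped (requires azcopy and docker)"
fi
echo "$RESTORE_STATUS"
```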
Step 4: Automate the Backup with a Cron Job
4.1 Add a Cron Job
Edit the crontab file:

```bash
crontab -e
```

Add the following line at the end to schedule daily backups at 11 AM:

```bash
0 11 * * * /home/azureuser/docker_volume_backup.sh >> /var/log/backup.log 2>&1
```
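To sanity-check the schedule: a crontab entry has five time fields (minute, hour, day of month, month, day of week) followed by the command, so `0 11 * * *` means minute 0 of hour 11, every day. A quick sketch of splitting the fields in bash:

```shell
#!/bin/bash
ENTRY="0 11 * * *"
# Split the five time fields on whitespace
read -r MIN HOUR DOM MON DOW <<< "$ENTRY"
echo "minute=$MIN hour=$HOUR day-of-month=$DOM month=$MON day-of-week=$DOW"
# → minute=0 hour=11 day-of-month=* month=* day-of-week=*
```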
4.2 Save and Exit
- If using nano, press `CTRL + X`, then `Y`, and press `Enter`.
- If using vim, press `ESC`, type `:wq`, and press `Enter`.
4.3 Verify Cron Job
Run the following command to list the scheduled cron jobs:

```bash
crontab -l
```

It should display the job scheduled to run at 11 AM daily.
Step 5: Set Up Lifecycle Management to Delete Old Backups
Azure doesn’t delete old backups automatically unless you configure Lifecycle Management. To keep only the most recent backups (four days’ worth, in this example):
1. Go to your Storage Account.
2. Navigate to Data management > Lifecycle Management.
3. Click + Add a rule.
4. Configure:
   - Rule name: `DeleteOldBackups`
   - Scope: `terrabackup/`
   - Blob type: Block blobs
   - Days since last modification: 4
   - Action: Delete the blob
5. Click Save.
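The same rule can also be defined as JSON and applied with the Azure CLI (`az storage account management-policy create --policy @policy.json`, plus your account and resource-group flags). A sketch of a policy document matching the settings above, assuming the `terrabackup` container from Step 1:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "DeleteOldBackups",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "delete": { "daysAfterModificationGreaterThan": 4 }
          }
        },
        "filters": {
          "blobTypes": [ "blockBlob" ],
          "prefixMatch": [ "terrabackup/" ]
        }
      }
    }
  ]
}
```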
Conclusion
By following this guide, you have:
- Created an Azure Blob Storage Account and a container for backups.
- Generated a SAS token for secure access.
- Written a shell script to back up Docker volumes.
- Scheduled a cron job for daily automation.
- Configured Azure Lifecycle Management to remove old backups.
This ensures that your Docker volume data is safely backed up and efficiently managed. 🚀
Written by Deepesh Gupta
DevOps & Cloud Enthusiast | Open Source Contributor | Driving Technological Excellence in DevOps and Cloud Solutions