☁️ Seamless Cloud Backup & Restore using Rclone: pCloud ➡️ AWS S3 Automation


Aim 🎯
The aim of this task is to securely back up and restore data between another cloud provider (pCloud) and AWS S3. 💾🔒
To achieve this, I performed the entire task with Rclone, wrote an automated Bash script, and scheduled Cron jobs to ensure continuous, safe, and hassle-free data synchronization. 🌐✅
Introduction 📖
Data is the most valuable asset 💻💾, and keeping it safe and accessible is crucial. In this task, I used Rclone's sync command to securely back up and restore data between pCloud and AWS S3 ☁️➡️☁️. With a Bash script 🖥️ and Cron jobs ⏰, the process is fully automated, secure, and reliable ✅.
📁 Agenda
🛠️ Prerequisites Setup
🖥️ EC2 Instance Configuration
👤 IAM User & S3 Bucket Creation
📦 Rclone Installation & Remote Configuration
💻 Backup & Restore Script Development
⏱️ Cron Automation Setup
🧪 Testing & Verification
🏁 Conclusion
🛠️ Prerequisites
✅ AWS Account with EC2 access
✅ pCloud account (or any other cloud) with data to back up
✅ Basic Linux command knowledge
✅ SSH access to Ubuntu server
🖥️ Step 1: EC2 Instance Setup
Launch Ubuntu EC2 Instance
# Instance Type: t3.micro (Free tier eligible)
# AMI: Ubuntu Server 22.04 LTS
# Security Group: Allow SSH (Port 22)
# Key Pair: Create or use existing
Connect to Instance
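A typical connection command looks like this (the key file name and public IP below are placeholders for your own values):
ssh -i your-key.pem ubuntu@<EC2-PUBLIC-IP>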
👤 Step 2: IAM User & S3 Bucket Creation
🔏 Create IAM User
Go to AWS IAM Console
Create new user:
rclone-backup-user
Attach policy:
AmazonS3FullAccess
Generate Access Keys (save securely)
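If you prefer the command line over the console, here is a sketch of the same setup using the AWS CLI (run from any machine with admin credentials configured; the user name matches the one above):
aws iam create-user --user-name rclone-backup-user
aws iam attach-user-policy --user-name rclone-backup-user --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name rclone-backup-user
The last command prints the Access Key ID and Secret Access Key; store them securely, as the secret cannot be retrieved again later.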
🪣 Create S3 Bucket
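The bucket can be created from the S3 console or, as a sketch, with the CLI below (mybucket1-apurv matches the bucket used in the script later; bucket names are globally unique, so substitute your own):
aws s3 mb s3://mybucket1-apurv --region us-east-1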
📦 Step 3: Rclone Installation & Configuration
Install Rclone
sudo apt update
sudo apt install -y rclone
rclone version
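Note: the apt package can lag behind the latest rclone release; if a newer version is needed, rclone's official install script is an alternative:
curl https://rclone.org/install.sh | sudo bash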
Configure pCloud Remote
rclone config
# Choose: n (New remote)
# Name: pcloud
# Storage: pcloud
# Follow the OAuth authentication process
# Paste your pCloud token when prompted
Because the EC2 instance has no browser, run the command below on a local machine to generate a pCloud token; rclone prints a token block that you then paste into the config prompt on the server:
rclone authorize "pcloud"
Configure AWS S3 Remote
rclone config
# Choose: n (New remote)
# Name: aws_s3 (must match the remote name used in the backup script)
# Storage: s3
# Provider: AWS
# Access Key ID: [Your IAM Access Key]
# Secret Access Key: [Your IAM Secret Key]
# Region: us-east-1
# Endpoint: [Leave blank]
# Location constraint: [Leave blank]
Verify Remotes
rclone listremotes
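If both remotes were configured correctly, the output should list them both:
pcloud:
aws_s3: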
💻 Step 4: Create Backup & Restore Script
Create backup_restore.sh
The cron job added later expects the script at /home/ubuntu/backup_restore.sh, so create it in the home directory:
cd /home/ubuntu
vim backup_restore.sh
Script Content
#!/bin/bash
# Remotes (must match the names shown by `rclone listremotes`)
PCLOUD_REMOTE="pcloud:"
S3_REMOTE="aws_s3:mybucket1-apurv"
# Log files (the ubuntu user needs write access to these; pre-create them once
# with `sudo touch <file> && sudo chown ubuntu <file>` or use a writable path)
BACKUP_LOG="/var/log/pcloud_to_s3_backup.log"
RESTORE_LOG="/var/log/s3_to_pcloud_restore.log"
# ----------------------------
# 1. Backup: pCloud → S3
# ----------------------------
echo "Starting backup from pCloud to S3..."
rclone sync "$PCLOUD_REMOTE" "$S3_REMOTE" --log-level=ERROR >> "$BACKUP_LOG" 2>&1
echo "Backup completed. Log: $BACKUP_LOG"
# ----------------------------
# 2. Restore: S3 → pCloud
# ----------------------------
echo "Starting restore from S3 to pCloud..."
rclone sync "$S3_REMOTE" "$PCLOUD_REMOTE" --log-level=ERROR >> "$RESTORE_LOG" 2>&1
echo "Restore completed. Log: $RESTORE_LOG"
Remotes:
* $PCLOUD_REMOTE → pCloud account remote.
* $S3_REMOTE → AWS S3 bucket remote.
Log files:
* $BACKUP_LOG → stores backup errors/output.
* $RESTORE_LOG → stores restore errors/output.
How it works:
* Backup (pCloud → S3): rclone sync copies data from pCloud to S3 and logs errors.
* Restore (S3 → pCloud): rclone sync restores data back from S3 to pCloud with logging.
* --log-level=ERROR >> $RESTORE_LOG 2>&1 → logs errors only; both stdout and stderr are appended to the log file.
* Echo messages: show the start and completion of backup/restore in the terminal.
Make Script Executable
chmod +x backup_restore.sh
🔄 Step 5: Manual Testing
Test Backup
./backup_restore.sh
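If the remotes and permissions are set up correctly, the output should mirror the script's echo lines:
Starting backup from pCloud to S3...
Backup completed. Log: /var/log/pcloud_to_s3_backup.log
Starting restore from S3 to pCloud...
Restore completed. Log: /var/log/s3_to_pcloud_restore.log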
⏱️ Step 6: Automate with Cron
Open Crontab
crontab -e
Add Cron Job (Every 1 minute)
*/1 * * * * /home/ubuntu/backup_restore.sh >> /var/log/backup_restore_cron.log 2>&1
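To confirm the job is registered and watch it run, a quick check looks like this (the log path matches the cron entry above; as with the script logs, the ubuntu user needs write access to it):
crontab -l
tail -f /var/log/backup_restore_cron.log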
A. List Files in the /home/ubuntu/ Directory
ls -l /home/ubuntu/
Displays a detailed list of all files and directories inside /home/ubuntu/.
B. Create a Folder Named myfiles
mkdir -p /home/ubuntu/myfiles
Creates a folder called myfiles inside /home/ubuntu/. The -p option ensures the command won't give an error if the folder already exists.
C. Create test1.txt with Sample Content
echo "Test file" > /home/ubuntu/myfiles/test1.txt
Creates a file named test1.txt inside myfiles and writes Test file into it.
D. Create test2.txt with Sample Content
echo "Another file" > /home/ubuntu/myfiles/test2.txt
Creates a file named test2.txt inside myfiles and writes Another file into it.
E. Copy Files to AWS S3 (Dry Run)
rclone copy /home/ubuntu/myfiles aws_s3:mybucket1-apurv --dry-run --log-level=ERROR
Uses rclone to copy all files from /home/ubuntu/myfiles to the AWS S3 bucket named mybucket1-apurv. The --dry-run option simulates the copy process without actually transferring the files. (Note: rclone prints dry-run messages at NOTICE level, so combined with --log-level=ERROR the simulated actions are suppressed; drop the log flag to see what would be copied.)
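Once the dry run looks right, dropping the --dry-run flag performs the actual transfer:
rclone copy /home/ubuntu/myfiles aws_s3:mybucket1-apurv --log-level=ERROR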
F. Sync Files from pCloud to AWS S3
rclone sync pcloud: aws_s3:mybucket1-apurv --log-level=ERROR
Uses rclone to sync files from pCloud storage to the AWS S3 bucket mybucket1-apurv. --log-level=ERROR shows only error messages during execution.
🧪 Step 7: Testing & Verification
Case Study 1: Existing Files
✅ All existing pCloud files synced to S3
✅ Directory structure maintained
✅ File integrity preserved
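One way to verify the integrity claim is rclone's built-in check command, which compares the files on the two remotes and reports any mismatches:
rclone check pcloud: aws_s3:mybucket1-apurv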
Case Study 2: New File Addition
Add a new file to pCloud
Wait 1 minute
Check S3 bucket → File automatically appears
Verify in logs
I uploaded ad.jpg to pCloud and removed a text file; within a minute the cron job automatically synced both changes to the S3 bucket. A screenshot of the automatic sync is attached below.
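To double-check from the server itself, listing the bucket with rclone confirms the newly synced object:
rclone ls aws_s3:mybucket1-apurv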
⚠️ Challenges & ✅ Success in Automating Secure Backup from pCloud to AWS S3
Task Overview & Challenges:
⏳ Faced major difficulties over 2 days while setting up backup and restore between pCloud and S3.
💾 Backup worked, but restore had issues initially.
Steps Taken & Solutions:
📤 Uploaded data to pCloud, but the cron job was not picking up the latest changes immediately.
❌ Initially, the latest files were missing from the backup due to timing issues.
🔄 Switched to rclone sync, which updates S3 whenever anything changes in pCloud.
⏱️ The cron job now syncs every minute, keeping the backup close to real time.
✅ Tested with rclone copy to verify that the remote connection and permissions were correct.
Results:
🛠️ After solving all errors, backup is now working perfectly.
🔒 All data is synced securely and automatically from pCloud → AWS S3.
📸 Screenshots attached show successful backup and automation in action.
Conclusion for Challenges:
🤖 Automated backup with cron + rclone ensures highly secure, real-time sync.
✨ Reduces manual effort and guarantees up-to-date cloud storage.
🏁 Conclusion
✅ Learned how to automate secure cloud backup using pCloud, AWS S3, cron, and Rclone.
💡 Gained hands-on experience in troubleshooting errors, syncing data, and ensuring real-time updates.
🏆 Successfully set up a fully automated, secure, and reliable backup system.
👨‍💻 About the Author
This series isn't just about using AWS; it's about mastering the core services that power modern cloud infrastructure.
📬 Let's Stay Connected
📧 Email: gujjarapurv181@gmail.com
🐙 GitHub: github.com/ApurvGujjar07
💼 LinkedIn: linkedin.com/in/apurv-gujjar
Written by Gujjar Apurv
Gujjar Apurv is a passionate DevOps Engineer in the making, dedicated to automating infrastructure, streamlining software delivery, and building scalable cloud-native systems. With hands-on experience in tools like AWS, Docker, Kubernetes, Jenkins, Git, and Linux, he thrives at the intersection of development and operations. Driven by curiosity and continuous learning, Apurv shares insights, tutorials, and real-world solutions from his journey—making complex tech simple and accessible. Whether it's writing YAML, scripting in Python, or deploying on the cloud, he believes in doing it the right way. "Infrastructure is code, but reliability is art."