AWS to GCP VM Migration

This article walks through the procedure to migrate an AWS EC2 VM to GCP.
Create an EC2 instance with only root volume
An EC2 instance can be created using the AWS console or the CLI.
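As a sketch, an instance with only a root volume could be launched from the CLI like this (the AMI ID, key pair name, and subnet ID below are placeholders for your own values):

```shell
# Launch a single t3.micro instance; all IDs below are placeholders.
aws ec2 run-instances \
  --image-id ami-xxxxxxxxxxxx \
  --instance-type t3.micro \
  --key-name my-key-pair \
  --subnet-id subnet-xxxxxxxxxxxx \
  --count 1
```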
Create an S3 bucket
The bucket will store the .ova file generated by the instance export task.
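A minimal sketch of creating the bucket from the CLI (the bucket name and Region are placeholders; bucket names must be globally unique, and the bucket is typically created in the same Region as the instance being exported):

```shell
# Create the bucket that will receive the exported .ova file.
aws s3 mb s3://my-export-bucket --region us-east-1
```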
Create an EC2 instance export task using the CLI
aws ec2 create-instance-export-task --instance-id i-xxxxxxxxxxxx --target-environment vmware --export-to-s3-task DiskImageFormat=VMDK,ContainerFormat=ova,S3Bucket=bucket_name
For this command to succeed, your S3 bucket must have an ACL containing the following grants. Use the grantee ID specific to your Region:
Africa (Cape Town) :- 3f7744aeebaf91dd60ab135eb1cf908700c8d2bc9133e61261e6c582be6e33ee
Asia Pacific (Hong Kong) :- 97ee7ab57cc9b5034f31e107741a968e595c0d7a19ec23330eae8d045a46edfb
Asia Pacific (Hyderabad) :- 77ab5ec9eac9ade710b7defed37fe0640f93c5eb76ea65a64da49930965f18ca
Asia Pacific (Jakarta) :- de34aaa6b2875fa3d5086459cb4e03147cf1a9f7d03d82f02bedb991ff3d1df5
Asia Pacific (Malaysia) :- ed006f67543afcfe0779e356e52d5ed53fa45f95bcd7d277147dfc027aaca0e7
Asia Pacific (Melbourne) :- 8b8ea36ab97c280aa8558c57a380353ac7712f01f82c21598afbb17e188b9ad5
Asia Pacific (Osaka) :- 40f22ffd22d6db3b71544ed6cd00c8952d8b0a63a87d58d5b074ec60397db8c9
Asia Pacific (Thailand) :- d011fe83abcc227a7ac0f914ce411d3630c4ef735e92e88ce0aa796dcfecfbdd
Canada West (Calgary) :- 78e12f8d798f89502177975c4ccdac686c583765cea2bf06e9b34224e2953c83
Europe (Milan) :- 04636d9a349e458b0c1cbf1421858b9788b4ec28b066148d4907bb15c52b5b9c
Europe (Spain) :- 6e81c4c52a37a7f59e103625162ed97bcd0e646593adb107d21310d093151518
Europe (Zurich) :- 5d9fcea77b2fb3df05fc15c893f212ae1d02adb4b24c13e18586db728a48da67
Israel (Tel Aviv) :- 328a78de7561501444823ebeb59152eca7cb58fee2fe2e4223c2cdd9f93ae931
Mexico (Central) :- edaff67fe25d544b855bd0ba9a74a99a2584ab89ceda0a9661bdbeca530d0fca
Middle East (Bahrain) :- aa763f2cf70006650562c62a09433f04353db3cba6ba6aeb3550fdc8065d3d9f
Middle East (UAE) :- 7d3018832562b7b6c126f5832211fae90bd3eee3ed3afde192d990690267e475
China (Beijing) and China (Ningxia) :- 834bafd86b15b6ca71074df0fd1f93d234b9d5e848a2cb31f880c149003ce36f
AWS GovCloud (US) :- af913ca13efe7a94b88392711f6cfc8aa07c9d1454d4f190a624b126733a5602
All other Regions :- c4d8eabf8db69dbe46bfe0e517100c554f01200b104d59cd408e777ba442a322
For each Grantee, provide the following permissions:
READ_ACP (In the Amazon S3 console, Bucket ACL should have the Read permission)
WRITE (In the Amazon S3 console, Objects should have the Write permission)
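These grants can also be applied from the CLI. A sketch using s3api follows; the canonical ID shown is the "All other Regions" one from the list above, the bucket name is a placeholder, and YOUR_CANONICAL_USER_ID must be replaced with your account's canonical user ID (put-bucket-acl replaces the entire ACL, so the owner's full-control grant must be included explicitly):

```shell
# Grant READ_ACP and WRITE to the Region-specific canonical user
# (here: the "All other Regions" ID), keeping owner full control.
aws s3api put-bucket-acl \
  --bucket my-export-bucket \
  --grant-read-acp id=c4d8eabf8db69dbe46bfe0e517100c554f01200b104d59cd408e777ba442a322 \
  --grant-write id=c4d8eabf8db69dbe46bfe0e517100c554f01200b104d59cd408e777ba442a322 \
  --grant-full-control id=YOUR_CANONICAL_USER_ID
```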
To monitor the progress of the export, issue the following command:
aws ec2 describe-export-tasks
Sample output
{
    "ExportTasks": [
        {
            "ExportTaskId": "export-i-0d1502ebc7c8e3f1t",
            "ExportToS3Task": {
                "ContainerFormat": "ova",
                "DiskImageFormat": "VMDK",
                "S3Bucket": "vhd-19-feb-2025",
                "S3Key": "export-i-0d1502ebc7c8e3f1t.ova"
            },
            "InstanceExportDetails": {
                "InstanceId": "i-04fec7bffe50798fd",
                "TargetEnvironment": "vmware"
            },
            "State": "active"
        }
    ]
}
Once the export finishes, the State field is displayed as completed.
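To wait for completion from a script, the state can be polled with a JMESPath query; a sketch, where the export task ID is the one from the sample output above and the interval is arbitrary:

```shell
# Poll every 30 seconds until the export task leaves the "active" state.
while true; do
  state=$(aws ec2 describe-export-tasks \
    --export-task-ids export-i-0d1502ebc7c8e3f1t \
    --query 'ExportTasks[0].State' --output text)
  echo "Export state: $state"
  [ "$state" != "active" ] && break
  sleep 30
done
```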
Once the .ova file is in S3, copy it to your local machine using the following CLI command:
aws s3 cp s3://bucket_name/file_name.ova .
Then upload the .ova file to GCP Cloud Storage using the console or the command line.
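From the command line, the upload could look like this (the local file name and destination bucket are placeholders):

```shell
# Upload the exported image to a Cloud Storage bucket.
gcloud storage cp file_name.ova gs://my-gcs-bucket/
```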
Importing the exported disk to Compute Engine
Before starting the import process, the service account ( service-HOST_PROJECT_NUMBER@gcp-sa-vmmigration.iam.gserviceaccount.com ) must have the following roles :-
Storage Object Viewer :- Lets Migrate to Virtual Machines read the source image from Cloud Storage.
VM Migration Service Account :- Allows the Migrate to Virtual Machines service account in the host project to create the image in the target project.
In the GCP console, go to Virtual Machines -> Machine images -> IMPORT MACHINE IMAGE -> BEGIN IMPORT.
This opens Migrate to Virtual Machines.
Give the image a name.
Select the .ova file from Cloud Storage.
Choose a region.
Choose the target project. If it is not listed, click Manage Targets, add your desired project, and then select it again.
Keep the option Auto-select Compute Engine Machine type checked so GCP automatically selects an appropriately sized Compute Engine machine type.
Leave all other options at their defaults.
The import converts the .ova file in Cloud Storage into a machine image, which can then be used to create an instance.
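Once the machine image exists, an instance can be created from it via the CLI; a sketch, where the instance name, image name, and zone are placeholders:

```shell
# Create a VM from the imported machine image.
gcloud compute instances create migrated-vm \
  --source-machine-image=my-imported-image \
  --zone=us-central1-a
```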
Migrating additional disks
If the source VM has any additional disks, copy the data from each disk's mount point to your local machine, unmount the disks, and remove their entries from /etc/fstab before exporting.
Then, on the migrated VM, create additional disks, attach them to the VM, mount them ( preferably at the same locations ), and copy the respective data back to each disk.
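The additional-disk steps above can be sketched as follows (disk name, size, zone, device path, and mount point are all placeholders; the actual device name on the VM should be checked with lsblk):

```shell
# On GCP: create and attach an additional disk to the migrated VM.
gcloud compute disks create data-disk-1 --size=100GB --zone=us-central1-a
gcloud compute instances attach-disk migrated-vm \
  --disk=data-disk-1 --zone=us-central1-a

# On the migrated VM: format, mount at the original location, restore data.
sudo mkfs.ext4 /dev/sdb          # device name may differ; check with lsblk
sudo mount /dev/sdb /data
sudo rsync -a /path/to/backup/ /data/
```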
Written by Pushkar Laulkar