Seamless Multi-Cloud Deployment with Azure DevOps, Terraform, and Ansible

Balraj Singh
9 min read

Image

Overview

This project shows how to set up multi-cloud automation using Azure DevOps, Terraform, and Ansible. The aim is to automate the deployment of a .NET application across both AWS and Azure environments. The project is divided into four parts, each focusing on a different stage of the automation process.

Key Points

  • Azure DevOps is used for the CI/CD workflow.

  • Terraform is used to provision infrastructure on AWS and Azure.

  • Ansible is used for configuration management and application deployment.

  • The Terraform remote state file is stored in Azure Blob Storage to maintain the state of the infrastructure.

  • A self-hosted agent is used to overcome the limitations of the ephemeral agents provided by Azure DevOps.

Prerequisites

Before diving into this project, here are some skills and tools you should be familiar with:

Tools and Technologies

  • Azure DevOps: For CI/CD pipeline management.

  • Terraform: For infrastructure provisioning.

  • Ansible: For configuration management and application deployment.

  • AWS: For provisioning EC2 instances.

  • Azure: For provisioning virtual machines.

  • .NET: For the application being deployed.

  • Ubuntu: As the base operating system for the virtual machines.

Setting Up the Infrastructure

I have written Terraform code that sets up the entire infrastructure automatically, including the installation of the required applications and tools and the creation of the storage.

Note: The self-hosted agent VM will take approximately 5 to 10 minutes to install all the required software/packages.

  • A virtual machine named "devopsdemovm" will be created

  • Ansible installed

  • Azure CLI installed

  • Storage account & blob container set up

  • .NET installed

First, we'll create the necessary virtual machines using Terraform.

  • Below is the layout of the Terraform code.

  • Once you clone the repo, you will see the following files:

      $ ls -l
      -rw-r--r-- 1 bsingh 1049089   573 Feb 19 15:37 aws_connection.tf       
      -rw-r--r-- 1 bsingh 1049089   876 Feb 24 15:57 azure_rm_connection.tf  
      -rw-r--r-- 1 bsingh 1049089   564 Feb 19 13:54 DevOps_UI.tf
      -rw-r--r-- 1 bsingh 1049089   419 Feb 19 13:55 group_lib.tf
      -rw-r--r-- 1 bsingh 1049089  3243 Feb 27 10:59 id_rsa
      -rw-r--r-- 1 bsingh 1049089   725 Feb 27 10:59 id_rsa.pub
      -rw-r--r-- 1 bsingh 1049089   769 Feb 20 11:27 output.tf
      -rw-r--r-- 1 bsingh 1049089   528 Feb 18 21:13 provider.tf
      drwxr-xr-x 1 bsingh 1049089     0 Feb 20 15:59 scripts/
      -rw-r--r-- 1 bsingh 1049089  6175 Feb 20 15:57 selfthost_agentvm.tf    
      -rw-r--r-- 1 bsingh 1049089   362 Feb 19 12:35 ssh_key.tf
      -rw-r--r-- 1 bsingh 1049089  1180 Feb 21 12:44 Storage.tf
      -rw-r--r-- 1 bsingh 1049089 72270 Feb 27 11:02 terraform.tfstate       
      -rw-r--r-- 1 bsingh 1049089   183 Feb 27 10:59 terraform.tfstate.backup
      -rw-r--r-- 1 bsingh 1049089  3654 Feb 21 13:13 terraform.tfvars        
      -rw-r--r-- 1 bsingh 1049089  3999 Feb 20 11:05 variable.tf
    
  • Next, run the Terraform commands:

    terraform init
    terraform fmt
    terraform validate
    terraform plan
    terraform apply
    # Optional: terraform apply --auto-approve

Image

Once the Terraform run completes, verify the following to make sure everything was set up correctly.

Inspect the cloud-init logs:

Once connected to the VM, you can check the status of the user_data script by inspecting the log files:

# Primary log file for cloud-init
sudo tail -f /var/log/cloud-init-output.log
# or
sudo cat /var/log/cloud-init-output.log | more
  • If the user_data script runs successfully, you will see output logs and any errors encountered during execution.

  • If there’s an error, this log will provide clues about what failed.

Image

Verify the Installation

  • [x] Docker version
azureuser@devopsdemovm:~$ docker --version
Docker version 24.0.7, build 24.0.7-0ubuntu4.1

azureuser@devopsdemovm:~$ docker ps -a
  • [x] Ansible version
azureuser@devopsdemovm:~$ ansible --version
ansible [core 2.17.8]
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/home/azureuser/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3/dist-packages/ansible
  ansible collection location = /home/azureuser/.ansible/collections:/usr/share/ansible/collections
  executable location = /usr/bin/ansible
  python version = 3.10.12 (main, Feb  4 2025, 14:57:36) [GCC 11.4.0] (/usr/bin/python3)
  jinja version = 3.0.3
  libyaml = True
  • [x] Azure CLI version
azureuser@devopsdemovm:~$ az version
{
  "azure-cli": "2.67.0",
  "azure-cli-core": "2.67.0",
  "azure-cli-telemetry": "1.1.0",
  "extensions": {}
}
  • [x] Project Creation

  • Image

  • [x] Service Connection

  • Image

  • [x] Import Repo

  • Image

  • [x] Resource Group & Storage Account Creation

  • Image

  • Upload the private and public keys to the Library as secure files, as shown below.

  • Image

Note: These are the same keys that were created while provisioning the infrastructure.

Step-by-Step Execution

Part 1: Project Overview

  1. Introduction: Overview of the project and its phases.

  2. Architecture: High-level architecture diagram and workflow explanation.

  3. Azure DevOps: Setting up the CI/CD workflow.

Part 2: Build .NET Application

  1. Repository Setup: Structure of the .NET application repository.

  2. Build Pipeline: Steps to build the .NET application using Azure DevOps.

  3. Artifact Publishing: Publishing the build artifact for later use.

Part 3: Terraform Pipeline

  1. Remote State File: Setting up the remote state file in Azure Blob Storage.

  2. Terraform Scripts: Writing Terraform scripts to provision infrastructure on AWS and Azure.

  3. Pipeline Execution: Running the Terraform pipeline to provision the infrastructure.

Part 4: Deploy App using Ansible

  1. Self-Hosted Agent: Setting up and using a self-hosted agent for stable SSH connections.

  2. Dynamic Inventory: Creating a dynamic inventory for Ansible.

  3. Ansible Playbook: Writing and executing Ansible playbooks to deploy the .NET application.

01. Pipeline - Build (Package)

  • Build the package pipeline first.

    Image

    Image

    Image

    Image

🔔Here is the Updated pipeline code.🔔

  • Check the pipeline build status.

  • Image

  • Verify whether the artifact is published or not.

  • Image

  • Rename the pipeline as below, because that name will be used in the next pipeline.

      Name: Build-Pipeline
    

    Image

02. Pipeline - Create Infra

  • Create a new pipeline for the AWS and Azure infra setup.

  • We chose the starter pipeline; the steps are the same as those we followed in the build pipeline.

  • 🔔Here is the Updated pipeline for Infra Setup.🔔

  • Here is the pipeline, but a few parameters need to be adjusted, as below.

  • In the Terraform init task, adjust the service connection, storage account, etc.

  • Image

  • In the Terraform plan task, adjust the service connection, AWS region, etc.

  • Image

    Image

  • In the Terraform apply task, adjust the service connection, AWS region, etc.

  • Image

    Image

  • Rename the pipeline as below.

  • Image

  • Run the pipeline.

    • It will ask for permission; approve it.

      Image

  • Pipeline Status:

  • Image

Verify the infra setup in both cloud environments

  • EC2 in AWS

    Image

  • VMs in Azure

    Image
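
  • If you also want to verify from the terminal rather than the consoles, here is a small sketch using boto3 (the region is an assumption; the Azure VMs can be checked similarly with the Azure CLI installed earlier):

    # Minimal sketch: list EC2 instances to confirm the AWS side of the infra.
    # Assumes AWS credentials are configured; adjust the region to match your tfvars.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")  # region is an assumption
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            print(instance["InstanceId"],
                  instance["State"]["Name"],
                  instance.get("PublicIpAddress", "-"))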

Self-hosted agent setup

  • Open a PuTTY session to the self-hosted agent VM and run the following commands.

  • Verify agent status

    Image

  • If the agent is not visible, run the following commands to register the self-hosted agent:

        ./config.sh --unattended --url https://dev.azure.com/<organization name> --auth pat --token <token value> --pool Default --agent <agentname> --acceptTeeEula
        sudo ./svc.sh install
        sudo ./svc.sh start
      

03. Pipeline - Application with Ansible setup

  • Log in to the Azure DevOps portal, go to the Selfhosted-Ansible folder, and update the files as below.

  • File: add_to_known_hosts.py

    • Change the script as below (a minimal sketch of this script's logic appears after this list).

      Image

      Image

  • File: fetch_state_file.py

    • A SAS token needs to be generated for the blob (a sketch of this script also appears after this list).

    • How to generate a SAS token:

      Image

      Image

      Image

    • Update the SAS token in the same file, as below.

      Image

  • Now, upload the private key to the self-hosted agent under the /home/azureuser/.ssh directory.

    Image

    • Change the permissions:
    chmod 600 id_rsa

Image
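
For reference, here are minimal sketches of what these two helper scripts might look like. They are illustrative reconstructions under stated assumptions, not the repo's exact files: the inventory layout, group names, and the storage account/container/blob names are placeholders.

    # add_to_known_hosts.py (sketch): scan each target host's SSH key and
    # append it to ~/.ssh/known_hosts so Ansible can connect without an
    # interactive host-key prompt. Assumes the dynamic inventory nests
    # groups under all -> children, as in the inventory sketch shown later
    # in the Troubleshooting section.
    import json
    import subprocess
    from pathlib import Path

    KNOWN_HOSTS = Path.home() / ".ssh" / "known_hosts"

    def add_hosts(inventory_path="dynamic_inventory.json"):
        with open(inventory_path) as f:
            groups = json.load(f)["all"]["children"]
        hosts = [host for group in groups.values() for host in group["hosts"]]
        with open(KNOWN_HOSTS, "a") as known_hosts:
            for host in hosts:
                # ssh-keyscan prints the host's public key(s) on stdout
                scan = subprocess.run(["ssh-keyscan", "-H", host],
                                      capture_output=True, text=True, check=True)
                known_hosts.write(scan.stdout)

    if __name__ == "__main__":
        add_hosts()

And a sketch of fetching the remote state file with the SAS token generated above (the account URL, container, and blob names below are placeholders to be replaced with your own values):

    # fetch_state_file.py (sketch): download the Terraform remote state from
    # Azure Blob Storage using the SAS token generated in the portal.
    from azure.storage.blob import BlobClient  # from the azure-storage-blob package

    ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"  # placeholder
    SAS_TOKEN = "<paste-the-generated-sas-token-here>"               # placeholder

    def fetch_state(container="tfstate", blob="terraform.tfstate",
                    dest="state_file.tfstate"):
        client = BlobClient(account_url=ACCOUNT_URL, container_name=container,
                            blob_name=blob, credential=SAS_TOKEN)
        with open(dest, "wb") as f:
            f.write(client.download_blob().readall())

    if __name__ == "__main__":
        fetch_state()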

  • Create a new pipeline (Application).

  • Adjust the parameters.

    • Select the right project

    • Select the build Pipeline that generated the artifact.

      Image

  • Run the pipeline and it will ask for permission.

    Image

Troubleshooting

  • The pipeline failed because I encountered issues related to the environment and Python.

    Image

Based on the error message, it seems there are a few issues:
1. The virtual environment `myenv` is not found.
2. The `python` command is not found.
3. The dynamic inventory file `dynamic_inventory.json` is not being parsed correctly.

Additionally, I created a requirements.txt file in the /multi-cloud-project/Selfhosted-Ansible directory to install the necessary Python packages.

  • Create a new file listing the required Python packages:

    azure-storage-blob
    ansible

  • On the agent, change into the Ansible working directory and list its contents:

cd /home/azureuser/myagent/_work/ansible-files

azureuser@devopsdemovm:~/myagent/_work/ansible-files$ ls -la
total 88
drwxr-xr-x  2 azureuser azureuser  4096 Feb 27 00:59 .
drwxr-xr-x 10 azureuser azureuser  4096 Feb 27 00:28 ..
-rw-r--r--  1 azureuser azureuser   332 Feb 27 00:59 Proj1.service
-rw-r--r--  1 azureuser azureuser   887 Feb 27 00:59 add_to_known_hosts.py
-rw-r--r--  1 azureuser azureuser  2123 Feb 27 00:59 app-deploy-playbook.yaml
-rw-r--r--  1 azureuser azureuser   613 Feb 27 00:59 dynamic_inventory.json
-rw-r--r--  1 azureuser azureuser   886 Feb 27 00:59 fetch_state_file.py
-rw-r--r--  1 azureuser azureuser  1212 Feb 27 00:59 parse_ips_from_state.py
-rw-r--r--  1 azureuser azureuser   122 Feb 27 00:59 ping-playbook.yaml
-rw-r--r--  1 azureuser azureuser    29 Feb 27 00:59 requirements.txt
-rw-r--r--  1 azureuser azureuser 47537 Feb 27 00:59 state_file.tfstate

  • Test the dynamic inventory:

ansible-inventory --list -i dynamic_inventory.json

Image

  • Validate the JSON format:
cat dynamic_inventory.json | jq .

Image
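
For context, here is a minimal sketch of how a script like parse_ips_from_state.py might build dynamic_inventory.json from the fetched state file. The Terraform output names (aws_public_ip, azure_public_ip) are assumptions for illustration; adjust them to match your own outputs.

    # parse_ips_from_state.py (sketch): read the public IPs out of the fetched
    # Terraform state and write them into an Ansible inventory file.
    import json

    def build_inventory(state_path="state_file.tfstate"):
        with open(state_path) as f:
            state = json.load(f)
        outputs = state.get("outputs", {})
        aws_ip = outputs["aws_public_ip"]["value"]      # assumed output name
        azure_ip = outputs["azure_public_ip"]["value"]  # assumed output name
        # Nested group/host layout that `ansible-inventory -i` can parse directly
        return {
            "all": {
                "children": {
                    "aws": {"hosts": {aws_ip: {}}},
                    "azure": {"hosts": {azure_ip: {}}},
                }
            }
        }

    if __name__ == "__main__":
        with open("dynamic_inventory.json", "w") as f:
            json.dump(build_inventory(), f, indent=2)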

All Pipeline Status

  • All pipelines are working fine.

    Image

  • Verify application accessibility
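
    One quick way to confirm is a short reachability check, assuming the app is served over plain HTTP (the address below is a placeholder for your instance's public IP):

        # Quick check; replace the placeholder address before running.
        import urllib.request

        with urllib.request.urlopen("http://<public-ip>/") as resp:
            print(resp.status)  # expect HTTP 200 when the app is up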

Congratulations, the application is accessible. 🚀

04. Pipeline - Cleanup Infra Setup

Environment Cleanup:

  • As we are using Terraform, we will use the following command to delete the SSH key and storage account.

  • Run the Terraform command:

      terraform destroy --auto-approve
    

Image

Challenges

  • Ephemeral Agents: Using Microsoft-hosted agents posed challenges due to their ephemeral nature, making SSH connections difficult.

  • State Management: Managing the state file in a multi-cloud environment requires careful handling to ensure consistency.

  • Dynamic Inventory: A dynamic inventory had to be created for Ansible to manage instances across both AWS and Azure.

Benefits

  • Automation: Reduces manual effort and potential errors in deploying and managing infrastructure.

  • Scalability: Easily scalable to manage more instances or additional cloud providers.

  • Consistency: Ensures consistent deployment and configuration across different environments.

  • Learning: Provides valuable insights into multi-cloud management and automation tools.

Conclusion

This project showcases the power of automation in managing multi-cloud environments. By leveraging Azure DevOps, Terraform, and Ansible, we can achieve consistent and scalable deployments across AWS and Azure. The challenges faced during the project provided valuable learning experiences, and the benefits of automation were clearly demonstrated. This setup can be further enhanced with additional features and optimizations in the future.


Written by

Balraj Singh

Tech enthusiast with 15 years of experience in IT, specializing in server management, VMware, AWS, Azure, and automation. Passionate about DevOps, cloud, and modern infrastructure tools like Terraform, Ansible, Packer, Jenkins, Docker, Kubernetes, and Azure DevOps. Passionate about technology and continuous learning, I enjoy sharing my knowledge and insights through blogging and real-world experiences to help the tech community grow!