Azure Pipelines for Deploying GCP Cloud Functions

Table of contents
- Step-by-Step Procedure: Azure Pipeline for GCP Cloud Functions Deployment
- 1️⃣ Create and Configure GCP Service Account
- 2️⃣ Add the Service Account Key to Azure DevOps
- 3️⃣ Create/Configure Self-Hosted Ubuntu Agent
- 4️⃣ Prepare GCP Cloud Function Repository Structure
- 5️⃣ Update Azure Pipeline YAML (azure-pipelines.yml)
- 6️⃣ Push Code & YAML to Repository
- 7️⃣ Create Tag for Deployment
- 8️⃣ Pipeline Trigger & Deployment
- ✅ Summary

Anyone can sign up for Azure DevOps using their Microsoft email ID. Up to five users can access all core Azure DevOps features for free.
For more information on pricing, refer to: https://azure.microsoft.com/en-in/pricing/details/devops/azure-devops-services/
Step-by-Step Procedure: Azure Pipeline for GCP Cloud Functions Deployment
1️⃣ Create and Configure GCP Service Account
In Google Cloud Console → IAM & Admin → Service Accounts:
Create a service account with the permissions required for Cloud Functions deployment.
Download the JSON key and save it as `sa-azuredevops.json`.
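If you prefer the CLI, here is a minimal sketch of the same setup. The role list is an assumption (roles commonly needed for Gen 2 deployments, which build via Cloud Build and run on Cloud Run); adjust it to your project's requirements:

```bash
# Assumed names -- replace with your own project ID and service-account name
PROJECT_ID="x-arcanum-465305-e7"
SA_NAME="sa-azuredevops"
SA_EMAIL="${SA_NAME}@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account
gcloud iam service-accounts create "$SA_NAME" --project "$PROJECT_ID"

# Grant roles commonly needed for Cloud Functions Gen 2 deployments (assumed set)
for ROLE in roles/cloudfunctions.developer roles/iam.serviceAccountUser \
            roles/cloudbuild.builds.editor roles/run.admin; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" \
    --member "serviceAccount:${SA_EMAIL}" --role "$ROLE"
done

# Download the JSON key that the pipeline will use
gcloud iam service-accounts keys create sa-azuredevops.json --iam-account "$SA_EMAIL"
```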
2️⃣ Add the Service Account Key to Azure DevOps
Go to Azure DevOps Project → Pipelines → Library → Secure Files.
Upload `sa-azuredevops.json` for secure use in pipelines.
3️⃣ Create/Configure Self-Hosted Ubuntu Agent
You can use Microsoft-hosted agents: these are managed by Microsoft and readily available in Azure Pipelines. They come with various pre-installed software and environments (Windows, Ubuntu, macOS with different versions of Visual Studio, .NET, Java, Node.js, etc.) and are convenient for quick setup and general-purpose builds and deployments, but they offer less control over the environment.
For this demo I have used a self-hosted Ubuntu agent, since it provides more control over the environment. I created an Ubuntu 22.04 VM in GCP.
In Organization Settings → Agent Pools → create an agent pool, then click New agent.
It will show you instructions to download and set up the agent. I followed these instructions; below is the resulting set of commands.
On your Ubuntu VM:
```bash
mkdir myagent && cd myagent
curl -O https://vstsagentpackage.azureedge.net/agent/3.x.x/vsts-agent-linux-x64-<version>.tar.gz
tar zxvf vsts-agent-linux-x64-<version>.tar.gz
./config.sh              # configure: organization URL, PAT, agent pool
./run.sh                 # run interactively, or install it as a service instead:
sudo ./svc.sh install    # install as a systemd service
sudo ./svc.sh start      # start the service
```
When you run the config.sh script, it will ask for your Azure organization URL and then authenticate your Azure DevOps account via a PAT (Personal Access Token).
To generate the PAT, open the user settings menu at the top right of the Azure DevOps page → Personal Access Tokens. You can restrict the PAT's scopes based on your requirements; for demo purposes I gave the PAT full access, which is not recommended for production setups.
Ensure you save this PAT securely, since it is shown only once.
Once you have set up the agent and started it, ensure it shows as online in Azure DevOps → Organization Settings → Agent Pools → `ubuntu-gcp-pool`. When the agent is online, you're good to go.
4️⃣ Prepare GCP Cloud Function Repository Structure
Before writing the pipeline, ensure your repository is structured so the pipeline can locate and deploy functions easily.
Recommended repo layout I have followed:
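A sketch of that layout, using the `func-test` example from the tag guideline below:

```
repo-root/
├── azure-pipelines.yml
└── functions/
    └── func-test/
        ├── main.py            # Cloud Function handler, e.g., def hello_http(request):
        └── requirements.txt   # external dependencies
```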
Guidelines:
- Keep all functions under the `functions/` directory. The folder name should match the GCP Cloud Function name.
- `main.py` should include the Cloud Function handler (e.g., `def hello_http(request):`).
- `requirements.txt` must list any external dependencies.
- The pipeline identifies the function by using a Git tag of the format `cf/<function-name>-v<version>`. For example, the tag `cf/func-test-v1.0` deploys a Cloud Function named `func-test`, and the repo should have a directory `functions/func-test` containing all its code.
```python
# HTTP Gen2 function
def hello_http(request):
    return ("Hello from Akshata → GCP Cloud Functions Gen2!", 200)
```
Above is the simple Python code for the Cloud Function. Later we will push this code to the Azure Repo.
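For this simple function there are no external dependencies, so `requirements.txt` can stay empty; for Python runtimes, GCP's build adds the Functions Framework automatically when it is not listed. If you prefer to pin it explicitly (hypothetical version pin):

```
# functions/hello-aksh/requirements.txt
functions-framework==3.*
```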
5️⃣ Update Azure Pipeline YAML (azure-pipelines.yml)
Next we will create our Azure pipeline YAML code.
Key features of the pipeline:
- Trigger on tags in the format `cf/<function-name>-v<version>`.
- Clean the workspace before checkout to avoid old code.
- Check out the repo with full history (`fetchDepth: 0`).
- Download the GCP service account key securely.
- Install the Google Cloud SDK on the self-hosted agent.
- Authenticate using the service account.
- Extract the function name and version from the tag.
- Pre-deployment folder check to ensure the function folder exists.
- Deploy the Cloud Function (Gen 2, HTTP trigger, Python 3.11).
YAML:
```yaml
trigger:
  tags:
    include:
      - cf/*   # Trigger on tags like cf/hello-aksh-v1.0

pool:
  name: 'ubuntu-gcp-pool'

steps:
  # 1️⃣ Clean workspace
  - task: DeleteFiles@1
    displayName: 'Clean workspace'
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)'
      Contents: '**/*'
      RemoveSourceFolder: false

  # 2️⃣ Checkout repository
  - checkout: self
    clean: true
    fetchDepth: 0

  # 3️⃣ Download GCP service account key
  - task: DownloadSecureFile@1
    name: gcpKey
    inputs:
      secureFile: sa-azuredevops.json

  # 4️⃣ Deploy Cloud Function
  - script: |
      set -e

      # Install gcloud SDK
      sudo apt-get update -y
      sudo apt-get install -y apt-transport-https ca-certificates gnupg curl
      curl -fsSL https://packages.cloud.google.com/apt/doc/apt-key.gpg \
        | sudo gpg --dearmor --yes -o /usr/share/keyrings/cloud.google.gpg
      echo "deb [signed-by=/usr/share/keyrings/cloud.google.gpg] http://packages.cloud.google.com/apt cloud-sdk main" \
        | sudo tee /etc/apt/sources.list.d/google-cloud-sdk.list
      sudo apt-get update -y
      sudo apt-get install -y google-cloud-sdk

      # Authenticate
      gcloud auth activate-service-account --key-file "$(gcpKey.secureFilePath)"
      gcloud config set project "x-arcanum-465305-e7"
      gcloud config set functions/region "asia-south1"

      # Extract function name and version from tag
      REF="$(Build.SourceBranch)"
      TAG="${REF#refs/tags/cf/}"   # e.g., hello-aksh-v1.0
      FUNC="${TAG%-v*}"            # hello-aksh
      VERSION="${TAG##*-v}"        # 1.0
      echo "Deploying function: $FUNC, version: $VERSION"

      # Pre-deployment folder check
      if [ ! -d "functions/$FUNC" ]; then
        echo "Error: folder functions/$FUNC does not exist. Aborting deployment."
        exit 1
      fi

      # Optional debug
      echo "Workspace: $(System.DefaultWorkingDirectory)"
      ls -l functions
      ls -l "functions/$FUNC"

      # Deploy Cloud Function
      gcloud functions deploy "$FUNC" \
        --gen2 \
        --runtime python311 \
        --entry-point hello_http \
        --trigger-http \
        --allow-unauthenticated \
        --source "functions/$FUNC"
    displayName: "Deploy Cloud Function to GCP"
```
6️⃣ Push Code & YAML to Repository
In the Azure Repos page, create a repository, then follow the instructions to push code from your local repo to the Azure Repo. When you add the remote with `git remote add origin https://dev.azure.com/project/…..`, the first push will prompt you for the credentials you use to sign in to Azure DevOps.
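For reference, the first-time remote setup looks roughly like this; the org/project/repo placeholders are hypothetical, so copy the exact URL that Azure Repos shows you:

```bash
# Hypothetical placeholders -- use the clone URL from your Azure Repos page
git remote add origin https://dev.azure.com/<org>/<project>/_git/<repo>
git push -u origin main   # the first push prompts for your Azure DevOps credentials
```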
Below are the commands:
```bash
git add azure-pipelines.yml functions/<your-function-folders>
git commit -m "Add updated pipeline with pre-checks and tag versioning"
git push origin main
```
- Make sure your branch is up to date with the remote (`git pull --rebase`) to avoid push conflicts.
7️⃣ Create Tag for Deployment
- Important: the tag must be created on the commit that contains the updated `azure-pipelines.yml`.
```bash
git tag cf/hello-aksh-v1.0
git push origin cf/hello-aksh-v1.0
```
- Pipeline triggers automatically only for that tag.
You can also create and push a tag from the Azure DevOps console: go to Repos → Tags → New tag. Here the tag is based on main, which means the azure-pipelines.yml code present in the main branch will be used.
8️⃣ Pipeline Trigger & Deployment
- Azure DevOps detects the new tag `cf/<function-name>-v<version>`.
- The pipeline runs on the self-hosted `ubuntu-gcp-pool`.
- The workspace is cleaned → the repo is checked out → the GCP key is downloaded → the gcloud SDK is installed → authentication is done.
- The function name and version are extracted from the tag.
- The pre-deployment folder check ensures the function folder exists.
- The Cloud Function is deployed to GCP as Gen 2 with the Python 3.11 runtime.
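To verify the deployment, you can fetch the function's HTTPS endpoint and call it. A quick check, assuming the `hello-aksh` example and the project/region configured in the pipeline:

```bash
# Get the Gen 2 function's public URL
URL=$(gcloud functions describe hello-aksh \
  --gen2 --region asia-south1 \
  --format='value(serviceConfig.uri)')

# Invoke it (unauthenticated access was allowed at deploy time)
curl "$URL"
# Expected: Hello from Akshata → GCP Cloud Functions Gen2!
```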
✅ Summary
- Tagging a commit with `cf/<function-name>-v<version>` triggers the pipeline automatically.
- Supports multiple functions from a single pipeline.
Try this out in your own environment. Happy learning!