Ultimate Guide to Building a Secure DevSecOps Pipeline for FinTech on Azure

Table of contents
- 1️⃣ Tools and Technologies
- Learning Objectives
- 2️⃣ Setting Up the Development Environment
- Prerequisites
- 📁 Repository Structure
- 📄 app/main.py
- 📄 app/models.py
- 📄 app/database.py
- 📄 terraform/main.tf
- 3️⃣ Containerization and GitHub Integration
- 4️⃣ GitHub Actions CI/CD Pipeline
- 5️⃣ Security Integrations
- 6️⃣ Infrastructure as Code + Secrets Management
- 7️⃣ Monitoring and Compliance
- Now let’s deploy the application:
- Step 1: Verify Your Project Structure
- Step 2: Test Locally First
- Step 3: Test Docker Build Locally
- Phase 2: Azure Setup
- Phase 3: Infrastructure Deployment with Terraform
- Phase 4: GitHub Repository Setup
- Phase 5: SonarCloud Setup
- Phase 6: First Deployment
- Phase 7: Verification & Testing
- 📊 Phase 8: Monitoring & Maintenance

In the FinTech sector, security, compliance, and reliability are non-negotiable. As regulations tighten and cyber threats evolve, startups and established companies alike must adopt DevSecOps from day one. This guide walks you through building a secure, end-to-end DevSecOps CI/CD pipeline on Azure, tailored for FinTech applications built with a microservices architecture and deployed to Azure Kubernetes Service (AKS).
The goal is to develop a production-grade CI/CD pipeline for a FinTech startup that ensures:
- Security at every stage of development
- Fast, automated deployments
- Compliance with industry regulations
- Real-time monitoring and threat detection
1️⃣ Tools and Technologies
| Area | Tool |
| --- | --- |
| Version Control | Git (GitHub) |
| CI/CD | GitHub Actions |
| Containerization | Docker |
| Cloud Platform | Azure (AKS, Azure Monitor, Key Vault) |
| Infrastructure as Code | Terraform |
| Static Analysis | SonarCloud |
| Container Scanning | Trivy |
| Dynamic Testing | OWASP ZAP |
| Secrets Management | Azure Key Vault |
| Policy as Code | Checkov |
Learning Objectives
- Understand DevSecOps principles in a FinTech context
- Build a secure CI/CD pipeline on Azure using GitHub Actions
- Integrate automated security checks into the pipeline
- Deploy containerized microservices to Azure Kubernetes Service
- Monitor compliance and vulnerabilities continuously
2️⃣ Setting Up the Development Environment
Prerequisites
- Azure subscription with appropriate permissions
- GitHub account
- Docker Desktop installed
- Azure CLI and kubectl installed
- Terraform >= 1.0
- Python 3.11+
📁 Repository Structure
```
secure-fintech-devsecops/
│
├── app/
│   ├── main.py
│   ├── models.py
│   └── database.py
│
├── .github/
│   └── workflows/
│       └── devsecops-pipeline.yml
│
├── k8s/
│   ├── deployment.yaml
│   └── service.yaml
│
├── terraform/
│   └── main.tf
│
├── Dockerfile
├── requirements.txt
├── sonar-project.properties
└── README.md
```
Let's walk through each file, starting with the application code; the Terraform that provisions Azure AKS follows below.
📄 app/main.py
```python
from fastapi import FastAPI, Depends, HTTPException
from sqlalchemy.orm import Session
from pydantic import BaseModel
from typing import List
import hashlib

from . import models, database

app = FastAPI(title="FinTech API", version="1.0.0")

# Create tables
models.Base.metadata.create_all(bind=database.engine)


class UserCreate(BaseModel):
    username: str
    password: str


class UserResponse(BaseModel):
    id: int
    username: str

    class Config:
        from_attributes = True


class TransactionCreate(BaseModel):
    amount: float
    description: str
    user_id: int


class TransactionResponse(BaseModel):
    id: int
    amount: float
    description: str
    user_id: int

    class Config:
        from_attributes = True


def hash_password(password: str) -> str:
    """Hash password using SHA-256 (use bcrypt in production)"""
    return hashlib.sha256(password.encode()).hexdigest()


@app.get("/")
def read_root():
    return {"message": "FinTech API is running"}


@app.post("/register", response_model=UserResponse)
def register(user: UserCreate, db: Session = Depends(database.get_db)):
    # Check if user already exists
    existing_user = db.query(models.User).filter(models.User.username == user.username).first()
    if existing_user:
        raise HTTPException(status_code=400, detail="Username already registered")
    # Hash password before storing
    hashed_password = hash_password(user.password)
    db_user = models.User(username=user.username, password=hashed_password)
    db.add(db_user)
    db.commit()
    db.refresh(db_user)
    return db_user


@app.post("/transactions", response_model=TransactionResponse)
def create_transaction(tx: TransactionCreate, db: Session = Depends(database.get_db)):
    # Validate user exists
    user = db.query(models.User).filter(models.User.id == tx.user_id).first()
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    db_tx = models.Transaction(amount=tx.amount, description=tx.description, user_id=tx.user_id)
    db.add(db_tx)
    db.commit()
    db.refresh(db_tx)
    return db_tx


@app.get("/transactions", response_model=List[TransactionResponse])
def get_transactions(db: Session = Depends(database.get_db)):
    return db.query(models.Transaction).all()


@app.get("/users/{user_id}/transactions", response_model=List[TransactionResponse])
def get_user_transactions(user_id: int, db: Session = Depends(database.get_db)):
    user = db.query(models.User).filter(models.User.id == user_id).first()
    if not user:
        raise HTTPException(status_code=404, detail="User not found")
    return db.query(models.Transaction).filter(models.Transaction.user_id == user_id).all()
```
This Python FastAPI code implements a FinTech API with user registration and transaction management:
Setup: Defines a FastAPI app with SQLAlchemy for database operations and Pydantic for data validation.
User Registration: POST /register creates a user with a hashed password (SHA-256) if the username is unique.
Transaction Creation: POST /transactions records a transaction (amount, description) for an existing user.
Transaction Retrieval: GET /transactions fetches all transactions; GET /users/{user_id}/transactions gets transactions for a specific user.
Database: Uses SQLAlchemy to manage User and Transaction tables, with basic error handling for user existence.
📄 app/models.py
```python
from sqlalchemy import Column, Integer, String, Float, ForeignKey, DateTime
from sqlalchemy.orm import relationship
from datetime import datetime

from .database import Base


class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True, index=True)
    username = Column(String(50), unique=True, index=True, nullable=False)
    password = Column(String(255), nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)

    # Relationship to transactions
    transactions = relationship("Transaction", back_populates="user")


class Transaction(Base):
    __tablename__ = "transactions"

    id = Column(Integer, primary_key=True, index=True)
    amount = Column(Float, nullable=False)
    description = Column(String(255))
    user_id = Column(Integer, ForeignKey("users.id"), nullable=False)
    created_at = Column(DateTime, default=datetime.utcnow)

    # Relationship to user
    user = relationship("User", back_populates="transactions")
```
This Python code defines SQLAlchemy models for a FinTech API's database schema:
User Model:
Table: users
Columns: id (integer, primary key), username (unique string, max 50 chars), password (string, max 255 chars), created_at (datetime, defaults to current UTC time).
Relationship: One-to-many with Transaction (a user can have multiple transactions).
Transaction Model:
Table: transactions
Columns: id (integer, primary key), amount (float), description (string, max 255 chars), user_id (integer, foreign key to users.id), created_at (datetime, defaults to current UTC time).
Relationship: Many-to-one with User (each transaction belongs to one user).
The models establish a relational database structure for managing users and their transactions, with automatic timestamping and referential integrity via foreign keys.
📄 app/database.py
```python
from sqlalchemy import create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker
import os

# Use environment variable for database URL in production
DATABASE_URL = os.getenv("DATABASE_URL", "sqlite:///./fintech.db")

if DATABASE_URL.startswith("sqlite"):
    engine = create_engine(DATABASE_URL, connect_args={"check_same_thread": False})
else:
    engine = create_engine(DATABASE_URL)

SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
Base = declarative_base()


def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
```
This Python code sets up a database connection for the FinTech API using SQLAlchemy:
Database URL: Retrieves DATABASE_URL from environment variables, defaulting to a local SQLite database (fintech.db).
Engine Creation: Creates a SQLAlchemy engine, enabling SQLite multi-threading with check_same_thread: False if SQLite is used.
Session Factory: Configures a SessionLocal sessionmaker for creating database sessions with autocommit and autoflush disabled.
Base Class: Defines a Base class using declarative_base() for SQLAlchemy model definitions.
Dependency Injection: Provides a get_db generator function to yield a database session and ensure it closes after use.
This code enables scalable database interactions for the API, supporting both SQLite for development and other databases (e.g., PostgreSQL) in production.
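The get_db generator is FastAPI's dependency-injection idiom for per-request cleanup: FastAPI pulls one value from the generator for the request and resumes it afterwards, so the finally block always closes the session. The lifecycle can be demonstrated in isolation with a stand-in session object (a sketch; FakeSession is hypothetical and no SQLAlchemy is required):

```python
class FakeSession:
    """Hypothetical stand-in for SessionLocal() so the pattern runs on its own."""
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

def get_db():
    db = FakeSession()
    try:
        yield db
    finally:
        db.close()  # runs even if the request handler raised

# Roughly what FastAPI does around each request:
gen = get_db()
session = next(gen)        # value injected into the endpoint
assert not session.closed  # session stays open while the request is handled
gen.close()                # request finished; FastAPI winds the generator down
assert session.closed      # the finally block closed the session
```

The same try/finally-around-yield shape is what `contextlib.contextmanager` formalizes, which is why this pattern also works for any resource that must be released after a request.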
📄 terraform/main.tf
```hcl
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "~>3.0"
    }
    random = {
      source  = "hashicorp/random"
      version = "~>3.0"
    }
  }
}

provider "azurerm" {
  features {}
}

# Resource Group
resource "azurerm_resource_group" "rg" {
  name     = "fintech-rg"
  location = "East US"

  tags = {
    environment = "production"
    project     = "fintech-app"
  }
}

# Virtual Network
resource "azurerm_virtual_network" "vnet" {
  name                = "fintech-vnet"
  address_space       = ["10.0.0.0/16"]
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
}

# Subnet for general use
resource "azurerm_subnet" "subnet" {
  name                 = "fintech-subnet"
  resource_group_name  = azurerm_resource_group.rg.name
  virtual_network_name = azurerm_virtual_network.vnet.name
  address_prefixes     = ["10.0.1.0/24"]
}

# Subnet for AKS
resource "azurerm_subnet" "aks_subnet" {
  name                 = "aks-subnet"
  resource_group_name  = azurerm_resource_group.rg.name
  virtual_network_name = azurerm_virtual_network.vnet.name
  address_prefixes     = ["10.0.2.0/24"]
}

# Log Analytics Workspace
resource "azurerm_log_analytics_workspace" "law" {
  name                = "fintech-law"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  sku                 = "PerGB2018"
  retention_in_days   = 30
}

# Azure Container Registry (name must be globally unique, hence the random suffix)
resource "random_integer" "suffix" {
  min = 1000
  max = 9999
}

resource "azurerm_container_registry" "acr" {
  name                = "fintechacr${random_integer.suffix.result}"
  resource_group_name = azurerm_resource_group.rg.name
  location            = azurerm_resource_group.rg.location
  sku                 = "Basic"
  admin_enabled       = false
}

# AKS Cluster
resource "azurerm_kubernetes_cluster" "aks" {
  name                = "fintech-aks"
  location            = azurerm_resource_group.rg.location
  resource_group_name = azurerm_resource_group.rg.name
  dns_prefix          = "fintech-aks"

  default_node_pool {
    name                = "default"
    node_count          = 2
    vm_size             = "Standard_DS2_v2"
    vnet_subnet_id      = azurerm_subnet.aks_subnet.id
    enable_auto_scaling = true
    min_count           = 1
    max_count           = 3
  }

  identity {
    type = "SystemAssigned"
  }

  network_profile {
    network_plugin     = "azure"
    network_policy     = "azure"
    service_cidr       = "10.1.0.0/24" # Outside the VNet (10.0.0.0/16) to avoid any overlap
    dns_service_ip     = "10.1.0.10"
    docker_bridge_cidr = "172.17.0.1/16"
  }

  oms_agent {
    log_analytics_workspace_id = azurerm_log_analytics_workspace.law.id
  }

  tags = {
    environment = "production"
    project     = "fintech-app"
  }
}

# Role assignment for AKS to pull images from ACR
resource "azurerm_role_assignment" "aks_acr_pull" {
  principal_id                     = azurerm_kubernetes_cluster.aks.kubelet_identity[0].object_id
  role_definition_name             = "AcrPull"
  scope                            = azurerm_container_registry.acr.id
  skip_service_principal_aad_check = true
}

# Outputs
output "kube_config" {
  value     = azurerm_kubernetes_cluster.aks.kube_config_raw
  sensitive = true
}

output "acr_login_server" {
  value = azurerm_container_registry.acr.login_server
}
```
This Terraform code provisions Azure infrastructure for the FinTech API:
Providers: Configures azurerm (~>3.0) and random (~>3.0) providers for Azure resources and random number generation.
Resource Group: Creates fintech-rg in East US with production and project tags.
Virtual Network: Sets up fintech-vnet (10.0.0.0/16) with two subnets: fintech-subnet (10.0.1.0/24) and aks-subnet (10.0.2.0/24).
Log Analytics: Provisions fintech-law workspace for monitoring with 30-day retention.
Container Registry: Creates fintechacr with a random suffix, Basic SKU, for storing container images.
AKS Cluster: Deploys fintech-aks with a 2-node pool (Standard_DS2_v2), auto-scaling (1-3 nodes), Azure CNI, and Log Analytics integration.
Role Assignment: Grants AKS AcrPull access to the container registry.
Outputs: Exposes AKS kubeconfig (sensitive) and ACR login server URL.
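After terraform apply, those outputs can be pulled on demand with standard Terraform CLI flags (a sketch; run from the terraform/ directory):

```bash
# Print the ACR login server (plain value, suitable for scripting)
terraform output -raw acr_login_server

# Write the sensitive kubeconfig to a file instead of the terminal
terraform output -raw kube_config > aks-kubeconfig
export KUBECONFIG=$PWD/aks-kubeconfig
kubectl get nodes
```

Using `az aks get-credentials` (shown later) is the more common route; reading `kube_config` directly is handy in automation where the Azure CLI isn't available.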
3️⃣ Containerization and GitHub Integration
Dockerfile (Sample for a Python microservice)
```dockerfile
FROM python:3.11-slim

# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1
ENV PYTHONUNBUFFERED=1

# Set work directory
WORKDIR /app

# Install system dependencies (curl is needed by the HEALTHCHECK below;
# the slim image does not ship it)
RUN apt-get update && apt-get install -y \
    gcc \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Copy requirements first for better caching
COPY requirements.txt .

# Install Python dependencies
RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir -r requirements.txt

# Copy application code
COPY ./app ./app

# Create non-root user for security
RUN adduser --disabled-password --gecos '' appuser && \
    chown -R appuser:appuser /app
USER appuser

# Expose port
EXPOSE 8000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:8000/ || exit 1

# Run the application
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
```
This Dockerfile builds a container for the FinTech API:
Base Image: Uses python:3.11-slim for a lightweight Python 3.11 environment.
Environment Variables: Sets PYTHONDONTWRITEBYTECODE=1 to avoid .pyc files and PYTHONUNBUFFERED=1 for unbuffered output.
Work Directory: Sets /app as the working directory.
System Dependencies: Installs gcc (to compile native wheels) and curl (used by the health check), then cleans up the apt cache.
Python Dependencies: Copies requirements.txt, upgrades pip, and installs dependencies without caching.
Application Code: Copies the app directory into the container.
Security: Creates a non-root user appuser and sets ownership of /app for security.
Port: Exposes port 8000 for the application.
Health Check: Checks API health every 30s by curling http://localhost:8000/, with a 10s timeout and 3 retries.
Run Command: Starts the FastAPI app with uvicorn on 0.0.0.0:8000.
4️⃣ GitHub Actions CI/CD Pipeline
.github/workflows/devsecops-pipeline.yml
```yaml
name: DevSecOps Pipeline

on:
  push:
    branches: [ "main", "develop" ]
  pull_request:
    branches: [ "main" ]

env:
  REGISTRY: ghcr.io
  IMAGE_NAME: fintech-app

jobs:
  sonarcloud:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 0  # Shallow clones should be disabled for better analysis

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install pytest pytest-cov httpx

      - name: Run tests with coverage
        run: |
          pytest --cov=app --cov-report=xml --cov-report=html

      - name: SonarCloud Scan
        uses: SonarSource/sonarcloud-github-action@master
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}

      - name: Run Bandit Security Scan
        run: |
          pip install bandit
          bandit -r app/ -f json -o bandit-report.json || true

      - name: Upload Bandit Report
        uses: actions/upload-artifact@v3
        with:
          name: bandit-report
          path: bandit-report.json

  build-and-test:
    runs-on: ubuntu-latest
    needs: sonarcloud
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.11'

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install pytest pytest-cov httpx

      - name: Run tests
        run: |
          pytest --cov=app --cov-report=xml --cov-report=html

      - name: Build Docker image
        run: |
          docker build -t ${{ env.IMAGE_NAME }}:${{ github.sha }} .
          docker tag ${{ env.IMAGE_NAME }}:${{ github.sha }} ${{ env.IMAGE_NAME }}:latest

      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: '${{ env.IMAGE_NAME }}:latest'
          format: 'sarif'
          output: 'trivy-results.sarif'

      - name: Upload Trivy scan results
        uses: github/codeql-action/upload-sarif@v2
        if: always()
        with:
          sarif_file: 'trivy-results.sarif'

      - name: Test Docker container
        run: |
          docker run -d --name test-container -p 8000:8000 ${{ env.IMAGE_NAME }}:latest
          sleep 10
          curl -f http://localhost:8000/ || exit 1
          docker stop test-container
          docker rm test-container

  deploy:
    runs-on: ubuntu-latest
    needs: [sonarcloud, build-and-test]
    if: github.ref == 'refs/heads/main'
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Azure Login
        uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      - name: Get AKS credentials
        run: |
          az aks get-credentials --resource-group fintech-rg --name fintech-aks

      - name: Deploy to AKS
        run: |
          kubectl apply -f k8s/
          kubectl rollout status deployment/fintech-app
```
This GitHub Actions workflow, named DevSecOps Pipeline, automates testing, security scanning, building, and deploying the FinTech API. It triggers on pushes to the main and develop branches, and on pull requests targeting main.
Jobs and Steps
sonarcloud (Runs on ubuntu-latest):
Checkout Code: Clones repository with full history.
Set up Python: Installs Python 3.11.
Install Dependencies: Upgrades pip, installs requirements.txt, and adds pytest, pytest-cov, httpx.
Run Tests: Executes pytest with coverage, generating XML and HTML reports.
SonarCloud Scan: Runs SonarCloud analysis using GITHUB_TOKEN and SONAR_TOKEN.
Bandit Scan: Runs Bandit for Python security issues, outputs JSON report.
Upload Bandit Report: Saves bandit-report.json as an artifact.
build-and-test (Runs on ubuntu-latest, depends on sonarcloud):
Checkout Code: Clones repository.
Set up Python: Installs Python 3.11.
Install Dependencies: Same as sonarcloud job.
Run Tests: Executes pytest with coverage.
Build Docker Image: Builds and tags Docker image as fintech-app:<sha> and latest.
Trivy Scan: Scans Docker image for vulnerabilities, outputs SARIF report.
Upload Trivy Results: Uploads trivy-results.sarif to GitHub.
Test Container: Runs container, checks health via curl on http://localhost:8000/, then stops and removes it.
deploy (Runs on ubuntu-latest, depends on sonarcloud and build-and-test, only on main branch):
Checkout Code: Clones repository.
Azure Login: Authenticates with Azure using AZURE_CREDENTIALS.
Get AKS Credentials: Fetches credentials for fintech-aks in fintech-rg.
Deploy to AKS: Applies Kubernetes manifests from k8s/ and waits for fintech-app deployment rollout.
Environment Variables
REGISTRY: Set to ghcr.io (GitHub Container Registry).
IMAGE_NAME: Set to fintech-app.
This pipeline integrates SonarCloud for code quality, Trivy for container vulnerability scanning, and ensures secure deployment to Azure Kubernetes Service (AKS), aligning with DevSecOps principles.
5️⃣ Security Integrations
Trivy for Container Scanning
Included in the CI job above.
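The same scan can be reproduced locally before pushing (assuming the Trivy CLI is installed; the flags below are standard Trivy options):

```bash
# Scan the locally built image, failing the command on serious findings
trivy image --severity HIGH,CRITICAL --exit-code 1 fintech-app:local

# Scan the repo's IaC files (Terraform, Kubernetes, Dockerfile) for misconfigurations
trivy config .
```

Running this as a pre-push habit catches vulnerable base images before they ever reach CI.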
SonarCloud for SAST
Add a sonar-project.properties configuration file to the repo and set the SONAR_TOKEN secret in GitHub (detailed in Phase 5).
OWASP ZAP for DAST
DAST can be run against a staging endpoint post-deployment.
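One possible sketch of that post-deployment job, using the official ZAP baseline action (the version pin and target URL are placeholders to adapt to your setup):

```yaml
  dast:
    runs-on: ubuntu-latest
    needs: deploy
    steps:
      - name: OWASP ZAP Baseline Scan
        uses: zaproxy/action-baseline@v0.10.0
        with:
          target: 'http://YOUR-STAGING-IP/'  # replace with your staging endpoint
          fail_action: false                 # report findings without failing the job
```

The baseline scan is passive (spider plus passive rules), so it is safe to run against a live staging environment; a full active scan should only target disposable test environments.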
6️⃣ Infrastructure as Code + Secrets Management
Store credentials in Azure Key Vault
Inject them in your pipeline using Azure OIDC and GitHub Actions
Example:
```yaml
permissions:
  id-token: write   # required for OIDC federated login
  contents: read

steps:
  - name: Login to Azure
    uses: azure/login@v1
    with:
      client-id: ${{ secrets.AZURE_CLIENT_ID }}
      tenant-id: ${{ secrets.AZURE_TENANT_ID }}
      subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}

  - name: Fetch secrets from Key Vault
    run: az keyvault secret show --vault-name fintech-vault --name db-password
```
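To use a fetched secret in later steps without echoing it to the build log, it can be masked and exported (a sketch; fintech-vault and db-password mirror the example above):

```yaml
  - name: Fetch secret into the job environment
    run: |
      DB_PASSWORD=$(az keyvault secret show \
        --vault-name fintech-vault --name db-password \
        --query value -o tsv)
      echo "::add-mask::$DB_PASSWORD"
      echo "DB_PASSWORD=$DB_PASSWORD" >> "$GITHUB_ENV"
```

The `::add-mask::` workflow command ensures the value is redacted if any later step accidentally prints it.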
7️⃣ Monitoring and Compliance
Use Azure Monitor and Container Insights to monitor AKS.
Enable Azure Policy for security baselines and compliance scanning.
```bash
az aks enable-addons \
  --addons monitoring \
  --name fintech-aks \
  --resource-group fintech-rg
```
Now let’s deploy the application:
Step 1: Verify Your Project Structure
```bash
# In the VS Code terminal, verify your folder structure
tree

# Should show:
# secure-fintech-devsecops/
# ├── app/
# │   ├── main.py
# │   ├── models.py
# │   └── database.py
# ├── .github/workflows/
# │   └── devsecops-pipeline.yml
# ├── k8s/
# │   ├── deployment.yaml
# │   └── service.yaml
# ├── terraform/
# │   └── main.tf
# ├── Dockerfile
# ├── requirements.txt
# ├── sonar-project.properties
# └── README.md
```
Step 2: Test Locally First
Copy and paste each command into your terminal, step by step:
```bash
# Create a virtual environment
python -m venv venv

# Activate it (Windows):
venv\Scripts\activate
# Activate it (macOS/Linux):
source venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Run the application
uvicorn app.main:app --reload
```
🌐 Open browser: http://localhost:8000/docs
Test user registration
Create a transaction
Verify everything works
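With the server running, the endpoints can also be exercised from a second terminal with curl (field values below are arbitrary examples):

```bash
# Register a user
curl -X POST http://localhost:8000/register \
  -H "Content-Type: application/json" \
  -d '{"username": "alice", "password": "S3curePass!"}'

# Create a transaction for user id 1
curl -X POST http://localhost:8000/transactions \
  -H "Content-Type: application/json" \
  -d '{"amount": 250.0, "description": "Initial deposit", "user_id": 1}'

# List all transactions
curl http://localhost:8000/transactions
```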
Step 3: Test Docker Build Locally
```bash
# Build Docker image
docker build -t fintech-app:local .

# Test Docker container
docker run -p 8000:8000 fintech-app:local

# Test in browser again: http://localhost:8000/docs
# Stop the container with Ctrl+C
```
Phase 2: Azure Setup
Step 4: Login to Azure
```bash
# Login to Azure
az login

# List your subscriptions and set the one to use
az account list --output table
az account set --subscription "YOUR_SUBSCRIPTION_ID"

# Verify login
az account show
```
Step 5: Create Azure Service Principal for GitHub Actions
```bash
# Create a service principal (replace with your subscription ID)
az ad sp create-for-rbac --name "fintech-github-actions" \
  --role contributor \
  --scopes /subscriptions/YOUR_SUBSCRIPTION_ID \
  --sdk-auth

# SAVE THE OUTPUT - you'll need it for GitHub secrets!
# Output looks like:
# {
#   "clientId": "xxx",
#   "clientSecret": "xxx",
#   "subscriptionId": "xxx",
#   "tenantId": "xxx"
# }
```
Phase 3: Infrastructure Deployment with Terraform
Step 6: Initialize and Deploy Infrastructure
```bash
# Navigate to the terraform directory
cd terraform

# Initialize Terraform
terraform init

# Plan the deployment (review what will be created)
terraform plan

# Apply the configuration (type 'yes' when prompted)
terraform apply

# 📋 SAVE THE OUTPUTS:
# - ACR login server
# - Kube config (for connecting to AKS)
```
Step 7: Connect to Your AKS Cluster
```bash
# Get AKS credentials
az aks get-credentials --resource-group fintech-rg --name fintech-aks

# Verify connection - you should see your AKS nodes listed
kubectl get nodes
```
Phase 4: GitHub Repository Setup
Step 8: Create GitHub Repository
```bash
# Initialize git in your project folder
git init

# Add all files
git add .

# Commit
git commit -m "Initial commit: Secure FinTech DevSecOps Pipeline"

# Create the repository on GitHub (via web or CLI), then add the remote origin
git remote add origin https://github.com/YOUR_USERNAME/secure-fintech-devsecops.git

# Push to GitHub
git push -u origin main
```
Step 9: Setup GitHub Secrets
Go to your GitHub repository → Settings → Secrets and variables → Actions
Add these secrets:
Name: AZURE_CREDENTIALS
Value: [Paste the entire JSON output from Step 5]
Name: SONAR_TOKEN
Value: [Get from SonarCloud - see Step 11]
Phase 5: SonarCloud Setup
Step 10: SonarCloud Configuration
Go to SonarCloud.io
Sign up/Login with GitHub
Click "Import an organization from GitHub"
Select your repository
Get your organization key and project key from dashboard
Step 11: Update Configuration Files
```properties
# Edit sonar-project.properties and replace these values:
sonar.organization=your-organization-key  # From SonarCloud
sonar.projectKey=your-project-key         # From SonarCloud
```
Step 12: Get SonarCloud Token
In SonarCloud: Account → My Account → Security
Generate token → Copy it
Add to GitHub secrets as SONAR_TOKEN
Phase 6: First Deployment
Step 13: Trigger the Pipeline
```bash
# Make a small change to trigger the pipeline
echo "# Deploy $(date)" >> README.md

# Commit and push
git add .
git commit -m "Trigger initial deployment"
git push origin main
```
Step 14: Monitor the Deployment
Go to GitHub → Actions tab
Watch your pipeline run:
✅ SonarCloud analysis
✅ Build and test
✅ Security scans
✅ Deploy to AKS
Step 15: Get Your Application URL
```bash
# Wait for deployment to complete, then get the service external IP
kubectl get services

# Look for fintech-service with an EXTERNAL-IP
# It might show <pending> initially - wait a few minutes

# Once you have the external IP, your app will be available at:
# http://EXTERNAL-IP/docs
```
Phase 7: Verification & Testing
Step 16: Test Production Deployment
Open browser:
http://YOUR-EXTERNAL-IP/docs
Test user registration:
```json
{ "username": "isaac", "password": "testpass123" }
```
Verify responses are working
Step 17: Check Kubernetes Resources
```bash
# Check all resources
kubectl get all

# Check logs
kubectl logs deployment/fintech-app
```
Step 18: Monitor SonarCloud Results
Go to SonarCloud dashboard
Check your project quality gate
Review security vulnerabilities
Check code coverage percentage
📊 Phase 8: Monitoring & Maintenance
Step 19: Setup Monitoring
```bash
# Check Azure Monitor logs
az monitor log-analytics workspace show --resource-group fintech-rg --workspace-name fintech-law
```
Step 20: Regular Updates
```bash
# For future updates:
# 1. Make code changes
# 2. Commit and push
# 3. The pipeline automatically deploys
# 4. Monitor via GitHub Actions and Azure

# Scale your application:
kubectl scale deployment fintech-app --replicas=5

# Update the image:
kubectl set image deployment/fintech-app fintech=fintech-app:new-version
```
Uses of SonarCloud
Scans code in CI/CD pipelines to ensure quality before merging.
Provides real-time bug detection in IDEs via SonarLint.
Enforces quality gates to block substandard code merges.
Identifies vulnerabilities in project dependencies.
Supports multi-language analysis for open-source projects.
Uses of Trivy
Scans container images for OS and dependency vulnerabilities.
Detects misconfigurations in Terraform, Kubernetes, and Dockerfiles.
Integrates with CI/CD tools for automated vulnerability reporting.
Scans Git repositories for dependency vulnerabilities.
Converts scan results for SonarQube/Cloud integration.
Written by

ISAAC DIVINE
🚀 Cloud Enthusiast | Turning ideas into scalable solutions ☁️ | Passionate about shaping the future of tech with cloud computing 🌐 | Always exploring the next big thing in the digital sky ✨