Day 66 of 90 Days of DevOps Challenge: OS-specific Tasks, Error Handling & AWS S3 Automation

On Day 65, I explored Ansible Playbooks in detail, focusing on multi-task playbooks, using variables, and introducing basic conditional logic. This laid the groundwork for writing more advanced, dynamic automation scripts tailored to varied environments.
Today, I expanded my Ansible knowledge with practical implementations, covering:
OS-Specific Package Installation
Error Handling in Playbooks
Checking & Installing Maven Conditionally
Automating AWS S3 Bucket Creation via Ansible
Installing Java Based on OS Family
When managing a heterogeneous infrastructure, installing the same package on different OS families requires conditional execution. Here’s a playbook that installs Java on both Red Hat (e.g., Amazon Linux) and Debian-based (e.g., Ubuntu) systems:
```yaml
---
- hosts: all
  gather_facts: yes
  tasks:
    - name: Install Java on the Red Hat family
      yum:
        name: java
        state: latest
      when: ansible_os_family == 'RedHat'

    - name: Install Java on the Debian family
      apt:
        name: default-jdk   # 'java' is not an apt package name; default-jdk is the Debian meta-package
        state: latest
      when: ansible_os_family == 'Debian'
```
This ensures platform-specific package managers are used without manual intervention.
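The when: conditions above rely on facts collected because gather_facts is enabled. A quick way to inspect what value each host reports is a small debug task (a minimal sketch; the "all" host pattern is just an example):

```yaml
---
- hosts: all
  gather_facts: yes
  tasks:
    - name: Show the OS family fact used by the conditions above
      debug:
        var: ansible_os_family
```

Hosts in the Red Hat family report "RedHat", and Debian/Ubuntu hosts report "Debian".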
Error Handling in Playbooks
By default, Ansible stops executing a play on a host as soon as a task fails. However, in many scenarios we want the playbook to continue executing subsequent tasks even if an error occurs. This is where ignore_errors: yes comes into play.
Example Playbook for Error Handling:
```yaml
---
- hosts: all
  tasks:
    - name: This is the first task
      command: dates
      register: dates_output
      ignore_errors: yes

    - name: This is the second task
      debug:
        msg: "Second task executed..."
      when: dates_output.rc == 0

    - name: This is the third task
      debug:
        var: dates_output
```
Here, even though the misspelled dates command fails, the playbook continues: the second task is simply skipped (its when: condition is false), and the third task still runs and prints the registered output.
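Besides ignore_errors, Ansible also offers block/rescue/always for structured error handling, similar to try/except in programming languages. A minimal sketch reusing the same failing command:

```yaml
---
- hosts: all
  tasks:
    - block:
        - name: Run a command that may fail
          command: dates
      rescue:
        - name: Runs only if a task in the block failed
          debug:
            msg: "Command failed, recovering..."
      always:
        - name: Runs regardless of success or failure
          debug:
            msg: "Cleanup step"
```

This keeps recovery logic next to the task it protects instead of spreading when: checks across later tasks.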
Check and Install Maven on Control Node
A practical requirement is to verify if Maven is installed on the control node and install it if absent. Below is the approach:
```yaml
---
- hosts: localhost
  become: true
  tasks:
    - name: Check Maven version
      command: mvn --version
      register: output
      ignore_errors: yes

    - name: Print Maven check output
      debug:
        var: output

    - name: Install Maven if not present
      yum:
        name: maven
        state: latest
      when: output.failed
```
This conditional installation ensures that Maven is only installed when it is missing.
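The yum task above assumes a Red Hat-family control node. Ansible's generic package module can make the same conditional install portable across OS families — a sketch using the same output.failed condition:

```yaml
    - name: Install Maven if not present (OS-agnostic variant)
      package:          # resolves to yum, apt, dnf, etc. on the target host
        name: maven
        state: latest
      when: output.failed
```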
Automating S3 Bucket Creation Using Ansible
Automating AWS infrastructure components with Ansible is a powerful capability. Here's how to create an S3 bucket step-by-step:
Step 1: Install the required Ansible collection
```shell
ansible-galaxy collection install amazon.aws --force
```
Step 2: Install the boto3 Python package on the control node

```shell
ansible localhost -m pip -a "name=boto3 state=present" --become
```
Step 3: Export AWS credentials
```shell
export AWS_ACCESS_KEY_ID='your-access-key-id'
export AWS_SECRET_ACCESS_KEY='your-secret-access-key'
export AWS_DEFAULT_REGION='ap-south-1'
```
Step 4: Create and run the playbook, then verify the bucket in the AWS console.
```yaml
---
- hosts: localhost
  tasks:
    - name: Create S3 bucket
      amazon.aws.s3_bucket:
        name: your-unique-bucket-name
        state: present
        region: ap-south-1
      register: s3_bucket_info

    - name: Print S3 bucket info
      debug:
        var: s3_bucket_info
```
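For cleanup, the same amazon.aws.s3_bucket module can tear the bucket down by switching state to absent (a sketch; the bucket name is the same placeholder as above, and the bucket must be empty to delete):

```yaml
---
- hosts: localhost
  tasks:
    - name: Delete the S3 bucket when no longer needed
      amazon.aws.s3_bucket:
        name: your-unique-bucket-name
        state: absent
        region: ap-south-1
```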
Final Thoughts
Today’s session highlighted how Ansible can be used not just for configuration management but also for dynamic, conditional automation based on OS families and system state. Additionally, integrating AWS resource provisioning, such as S3 bucket creation, directly into Ansible playbooks expands its utility in cloud automation.
Moving forward, mastering these playbook patterns will be essential in scaling infrastructure automation and enhancing DevOps workflows.
Written by Vaishnavi D