AWS FinOps Dashboard (CLI) v2.2.7


The AWS FinOps Dashboard is an open-source, Python-based command-line tool (built with the Rich library) for AWS cost monitoring. It provides multi-account cost summaries by time period, service, and cost allocation tags; budget limits vs. actuals; EC2 instance status; six‑month cost trend charts; and “FinOps audit” reports (e.g. untagged or idle resources). It can export data to CSV/JSON/PDF.
Why AWS FinOps Dashboard?
Managing and understanding your AWS expenditure, especially across multiple accounts and services, can be complex. The AWS FinOps Dashboard CLI aims to simplify this by providing a clear, concise, and actionable view of your AWS costs and operational hygiene directly in your terminal.
Key benefits include:
Unified View: Consolidate cost and resource data from multiple AWS profiles.
Actionable Insights: Quickly identify spending patterns, untagged resources, and potential savings.
Terminal-First: Access crucial FinOps data without leaving your command line.
Customizable & Extensible: Open-source and built with Python, allowing for easy customization.
Features
Cost Analysis by Time Period:
View current & previous month's spend by default
Set custom time ranges (e.g., 7, 30, 90 days) with the --time-range option
Cost by AWS Service: Sorted by highest cost for better insights
Cost by Tag: Get cost data for one or more cost allocation tags with --tag (cost allocation tags must be enabled)
AWS Budgets Information: Displays budget limits and actual spend
EC2 Instance Status: Detailed state information across specified/accessible regions
Cost Trend Analysis: View detailed cost trends in bar charts for the last 6 months across AWS profiles
FinOps Audit: View untagged resources, unused or stopped resources, and budget breaches across AWS profiles. Audit reports are exported in PDF format.
Profile Management:
Automatic profile detection
Specific profile selection with --profiles
Use all available profiles with --all
Combine profiles from the same AWS account with --combine
Region Control: Specify regions for EC2 discovery using --regions
Export Options:
CSV export with --report-name and --report-type csv
JSON export with --report-name and --report-type json
Export to both CSV and JSON formats with --report-name and --report-type csv json
Specify output directory using --dir
PDF export with --report-name and --report-type pdf
Note: Audit reports (generated via --audit) currently only support PDF and JSON export. Other formats specified in --report-type will be ignored for these reports.
Improved Error Handling: Resilient and user-friendly error messages
Beautiful Terminal UI: Styled with the Rich library for a visually appealing experience
Prerequisites
Python 3.8 or later: Ensure you have the required Python version installed
Amazon Q CLI (optional): only needed for the Amazon Q analysis workflow described later
AWS CLI configured with named profiles: Set up your AWS CLI profiles for seamless integration
AWS credentials with permissions:
Create an IAM user, attach the permissions listed below (optionally also attach AWSBillingReadOnlyAccess or ReadOnlyAccess), and create an access key and secret key. A sample policy sketch is shown after the list.
ReadOnlyAccess documentation: https://docs.aws.amazon.com/aws-managed-policy/latest/reference/ReadOnlyAccess.html
ce:GetCostAndUsage
budgets:ViewBudget
ec2:DescribeInstances
ec2:DescribeRegions
sts:GetCallerIdentity
ec2:DescribeVolumes
ec2:DescribeAddresses
rds:DescribeDBInstances
rds:ListTagsForResource
lambda:ListFunctions
lambda:ListTags
elbv2:DescribeLoadBalancers
elbv2:DescribeTags
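For reference, these actions could be granted through a single customer-managed policy along the following lines (an illustrative sketch, not an official policy; note that IAM uses the elasticloadbalancing: action prefix for the ELBv2 permissions listed above):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "FinOpsDashboardReadOnly",
      "Effect": "Allow",
      "Action": [
        "ce:GetCostAndUsage",
        "budgets:ViewBudget",
        "sts:GetCallerIdentity",
        "ec2:DescribeInstances",
        "ec2:DescribeRegions",
        "ec2:DescribeVolumes",
        "ec2:DescribeAddresses",
        "rds:DescribeDBInstances",
        "rds:ListTagsForResource",
        "lambda:ListFunctions",
        "lambda:ListTags",
        "elasticloadbalancing:DescribeLoadBalancers",
        "elasticloadbalancing:DescribeTags"
      ],
      "Resource": "*"
    }
  ]
}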
Installation
There are several ways to install the AWS FinOps Dashboard:
Option 1 (Recommended): Using uv with a virtual environment
uv is a modern Python package installer and resolver that's extremely fast. Installing into a virtual environment is the safest and cleanest way to install Python packages:
# Install uv if you don't have it yet
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create a virtual environment
uv venv venv
# Activate it
source venv/bin/activate
# Then install the package
uv pip install aws-finops-dashboard
Option 2: Using pipx
First, make sure pip is available (on Amazon Linux / RHEL-based systems):
sudo yum install python3-pip -y
After that, test it:
pip3 --version
If you want to use pip instead of pip3, set an alias:
sudo alternatives --install /usr/bin/pip pip /usr/bin/pip3 1
Optional: Upgrade pip (once it's installed):
pip install --upgrade pip
Install pipx, then install aws-finops-dashboard:
pip install pipx
pipx install aws-finops-dashboard
Option 3: Using pip
pip install aws-finops-dashboard
Option 4: From Source
# Clone the repository
git clone https://github.com/BalajiiBharath/aws-finops-dashboard.git
cd aws-finops-dashboard
# Install using pip
pip install -e .
AWS CLI Profile Setup
Configure the AWS CLI using your access key and secret key.
If you haven't already, configure your named profiles using the AWS CLI:
aws configure --profile profile1-name
aws configure --profile profile2-name
# ... etc ...
Repeat this for all the profiles you want the dashboard to potentially access.
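These commands store your credentials and settings under ~/.aws. The resulting files look roughly like this (placeholder values shown):
# ~/.aws/credentials
[profile1-name]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

# ~/.aws/config
[profile profile1-name]
region = us-east-1
output = json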
Command Line Usage
Run the script using aws-finops followed by options:
aws-finops [options]
Command Line Options
Flag | Description
--config-file, -C | Path to a TOML, YAML, or JSON configuration file. Command-line arguments will override settings from the config file.
--profiles, -p | Specific AWS profiles to use (space-separated). If omitted, uses the 'default' profile if available, otherwise all profiles.
--regions, -r | Specific AWS regions to check for EC2 instances (space-separated). If omitted, attempts to check all accessible regions.
--all, -a | Use all available AWS profiles found in your config.
--combine, -c | Combine profiles from the same AWS account into single rows.
--tag, -g | Filter cost data by one or more cost allocation tags in Key=Value format. Example: --tag Team=DevOps Env=Prod
--report-name, -n | Specify the base name for the report file (without extension).
--report-type, -y | Specify report types (space-separated): 'csv', 'json', 'pdf'. For reports generated with --audit, only 'pdf' is applicable; other types will be ignored.
--dir, -d | Directory to save the report file(s) (default: current directory).
--time-range, -t | Time range for cost data in days (default: current month). Examples: 7, 30, 90.
--trend | View cost trend analysis for the last 6 months.
--audit | View a list of untagged resources, unused resources, and budget breaches.
Examples
# Use default profile, show output in terminal only
aws-finops
# Use specific profiles 'dev' and 'prod'
aws-finops --profiles dev prod
# Use all available profiles
aws-finops --all
# Combine profiles from the same AWS account
aws-finops --all --combine
# Specify custom regions to check for EC2 instances
aws-finops --regions us-east-1 eu-west-1 ap-southeast-2
# View cost data for the last 30 days instead of current month
aws-finops --time-range 30
# View cost data only for a specific tag (e.g., Team=DevOps)
aws-finops --tag Team=DevOps
# View cost data for multiple tags (e.g., Team=DevOps and Env=Prod)
aws-finops --tag Team=DevOps Env=Prod
# Export data to CSV only
aws-finops --all --report-name aws_dashboard_data --report-type csv
# Export data to JSON only
aws-finops --all --report-name aws_dashboard_data --report-type json
# Export data to both CSV and JSON formats simultaneously
aws-finops --all --report-name aws_dashboard_data --report-type csv json
# Export combined data for 'dev' and 'prod' profiles to a specific directory
aws-finops --profiles dev prod --combine --report-name report --report-type csv --dir output_reports
# View cost trend analysis as bar charts for profiles 'dev' and 'prod'
aws-finops --profiles dev prod -r us-east-1 --trend
# View cost trend analysis for all CLI profiles for a specific cost tag 'Team=DevOps'
aws-finops --all --trend --tag Team=DevOps
# View audit report for profile 'dev' in region 'us-east-1'
aws-finops -p dev -r us-east-1 --audit
# View audit report for profile 'dev' in region 'us-east-1' and export it as a pdf file to current working dir with file name 'Dev_Audit_Report'
aws-finops -p dev -r us-east-1 --audit -n Dev_Audit_Report -y pdf
# Use a configuration file for settings
aws-finops --config-file path/to/your_config.toml
# or
aws-finops -C path/to/your_config.yaml
You'll see a live-updating table of your AWS account cost and usage details in the terminal. If export options are specified, a report file will also be generated upon completion.
Using a Configuration File
Instead of passing all options via the command line, you can use a configuration file in TOML, YAML, or JSON format. Use the --config-file or -C option to specify the path to your configuration file.
Command-line arguments will always take precedence over settings defined in the configuration file.
Below are examples of how to structure your configuration file.
TOML Configuration Example (config.toml)
# config.toml
profiles = ["dev-profile", "prod-profile"]
regions = ["us-east-1", "eu-west-2"]
combine = true
report_name = "monthly_finops_summary"
report_type = ["csv", "pdf"] # For cost dashboard. For audit, only PDF is used.
dir = "./reports/aws-finops" # Defaults to present working directory
time_range = 30 # Optional; defaults to the current month if omitted
tag = ["CostCenter=Alpha", "Project=Phoenix"] # Optional
audit = false # Set to true to run audit report by default
trend = false # Set to true to run trend report by default
YAML Configuration Example (config.yaml or config.yml)
# config.yaml
profiles:
- dev-profile
- prod-profile
regions:
- us-east-1
- eu-west-2
combine: true
report_name: "monthly_finops_summary"
report_type:
- csv
- pdf # For cost dashboard. For audit, only PDF is used.
dir: "./reports/aws-finops"
time_range: 30
tag:
- "CostCenter=Alpha"
- "Project=Phoenix"
audit: false # Set to true to run audit report by default
trend: false # Set to true to run trend report by default
JSON Configuration Example (config.json)
Note: standard JSON does not support comments, so the annotations shown in the TOML and YAML examples are omitted here.
{
  "profiles": ["dev-profile", "prod-profile"],
  "regions": ["us-east-1", "eu-west-2"],
  "combine": true,
  "report_name": "monthly_finops_summary",
  "report_type": ["csv", "pdf"],
  "dir": "./reports/aws-finops",
  "time_range": 30,
  "tag": ["CostCenter=Alpha", "Project=Phoenix"],
  "audit": false,
  "trend": false
}
Example Terminal Outputs
aws-finops
aws-finops --audit
Using Amazon Q to analyse the aws-finops report and delete resources (e.g., unused disks, Elastic IPs)
Note: The AWS user whose access key and secret key we are using must have the necessary permissions to delete resources (e.g., EC2 Full Access or Administrator Access).
To Install:
Choose the appropriate download command based on your system architecture and glibc version:
Standard version (glibc 2.34+)
Linux x86-64
curl --proto '=https' --tlsv1.2 -sSf "https://desktop-release.q.us-east-1.amazonaws.com/latest/q-x86_64-linux.zip" -o "q.zip"
To install Amazon Q CLI
Unzip the installer:
unzip q.zip
Run the install program:
./q/install.sh
# or, if running as the root user:
./q/install.sh --force
By default, the files are installed to ~/.local/bin.
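If the q command is not found afterwards, make sure ~/.local/bin is on your PATH (or restart your shell), for example:
# Add the default install location to PATH for the current session
export PATH="$HOME/.local/bin:$PATH"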
Note: Once Amazon Q is installed, select a login method (Builder ID or Pro license). It will generate a code; open the provided URL to confirm the login.
Amazon Q can analyze the AWS FinOps dashboard audit report and assist in deleting unused volumes and Elastic IPs in the account.
Using prompt
Deleting the resources
Example PDF Reports
Export Formats
CSV Output Format
When exporting to CSV, a file is generated with the following columns:
CLI Profile
AWS Account ID
Last Month Cost (or previous period based on time range)
Current Month Cost (or current period based on time range)
Cost By Service (each service and its cost appears on a new line within the cell)
Budget Status (each budget's limit and actual spend appears on a new line within the cell)
EC2 Instances (each instance state and its count appears on a new line within the cell)
Note: Due to the multi-line formatting in some cells, it's best viewed in spreadsheet software (like Excel, Google Sheets, LibreOffice Calc) rather than plain text editors.
JSON Output Format
When exporting to JSON, a structured file is generated that includes all dashboard data in a format that's easy to parse programmatically.
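Since the export is a standard JSON file, it can be inspected or post-processed with everyday tooling. For example, assuming the report was exported with --report-name aws_dashboard_data --report-type json (the exact filename and schema depend on the tool version):
# Pretty-print the exported report
python3 -m json.tool aws_dashboard_data.json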
PDF Output Format (for Audit Report)
This is the export format for reports generated using the --audit flag. Currently, audit reports are only exported to PDF.
When exporting to PDF, a file is generated with the following columns:
Profile
Account ID
Untagged Resources
Stopped EC2 Instances
Unused Volumes
Unused EIPs
Budget Alerts
Cost For Every Run
This script makes API calls to AWS, primarily to Cost Explorer, Budgets, EC2, and STS. AWS may charge for some of these API calls; Cost Explorer requests are typically billed at $0.01 per request (check current pricing).
The number of API calls depends heavily on the options used:
Cost Explorer & Budgets: typically 3 ce:GetCostAndUsage calls and 1 budgets:DescribeBudgets call per profile processed.
STS: 1 sts:GetCallerIdentity call per profile processed (used for the account ID).
EC2:
1 ec2:DescribeRegions call initially (per session).
If --regions is not specified, the script attempts to check accessibility by calling ec2:DescribeInstances in multiple regions, potentially increasing API calls significantly.
If --regions is specified, 1 ec2:DescribeInstances call is made per specified region (per profile, unless --combine is used, where it's called once per region for the primary profile).
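For example, a run with two profiles and a single specified region would make roughly 2 × 3 = 6 ce:GetCostAndUsage requests (about $0.06 at $0.01 per request), plus 2 Budgets, 2 STS, and 2 ec2:DescribeInstances calls, which are typically free.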
To minimize API calls and potential costs:
Use the --profiles argument to specify only the profiles you need.
Use the --regions argument to limit EC2 checks to only relevant regions. This significantly reduces ec2:DescribeInstances calls compared to automatic region discovery.
Consider using the --combine option when working with multiple profiles from the same AWS account.
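For example, combining these options keeps the number of calls (and any Cost Explorer charges) low:
# Explicit profiles and regions, with same-account profiles combined
aws-finops --profiles dev prod --regions us-east-1 --combine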
The exact cost per run is usually negligible but depends on the scale of your usage and AWS pricing.