How to use the AWS Cost and Usage Report (CUR) to dive deep into EC2 instance costs with ARN-level granularity

🚀 AWS users, are you looking for a more comprehensive way to manage your costs and usage data? The AWS Cost and Usage Report (CUR) can help you dive deeper into your EC2 instance costs with ARN-level granularity, so you can make data-driven decisions to optimize your compute workloads.

👨‍💻 Here are some benefits of using the CUR:

✔️ Gain more visibility into your AWS costs and usage data with detailed reports.

✔️ Identify areas where you can optimize your compute workloads to save costs.

✔️ Get ARN-level granularity for EC2 instance costs for more accurate insights.

✔️ Use your preferred visualization tool to analyze your data.

📈 By using the CUR, you can gain a better understanding of your AWS costs and usage and make informed decisions to optimize your compute workloads. This can lead to significant cost savings for your business in the long run.

👉 Let me walk you through a step-by-step process to create a CUR, along with a SQL query you can use to dive deep into your EC2 costs:

Step 1: Creating a Cost and Usage Report on your AWS Account

This walkthrough leverages the billing data found in the AWS Cost and Usage Report (CUR). The AWS CUR breaks down cost and usage at a granular level (down to the hour or resource ARN) and contains the most comprehensive set of cost and usage data available.

To create a CUR Report, follow detailed instructions at Creating Cost and Usage Reports. Consider the following options when creating the CUR:

  • Configure a new Amazon S3 bucket to store the AWS CUR, for example demo-cur-report-XXX.

  • Select “Include resource IDs”. Many billing records will automatically include the Resource ID, which can be useful context when trying to understand a segment of your bill.

  • Select “Automatically refresh your Cost & Usage Report when charges are detected for previous months with closed bills”. This gives you the most accurate representation, as some refunds, credits, and AWS Support fees are calculated after the month is closed.

Figure 2. Create AWS Cost and Usage Report

  • Select “Overwrite existing report” for Report versioning. The entire month of data is written multiple times per day as the CUR is delivered. Overwriting ensures you keep only the latest copy of the current month and avoids storing duplicate data.

  • Select “Hourly” to receive the highest available granularity.

  • Select “Amazon Athena” under “Enable report data integration”. This automatically selects the Apache Parquet compression type, which Athena requires to run queries efficiently.

Figure 3. S3 Delivery Options for AWS Cost and Usage Report

It may take up to 24 hours for AWS to deliver your first report into the S3 bucket.

Step 2: Creating Athena Database and Querying AWS CUR using Athena

To streamline and automate the one-time integration of your Cost and Usage Reports with Athena, AWS provides an AWS CloudFormation template. This ensures that your latest cost and usage information is always available to Athena, with no additional work required to prepare your data for analysis. Follow the detailed instructions in Setting up Athena using AWS CloudFormation templates to complete the integration with Athena.
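Once the CloudFormation stack has completed, a quick sanity-check query in Athena confirms the CUR table is queryable. A minimal sketch, assuming the stack created a database and table named after the report (the names below are placeholders, substitute the ones created in your account):

```sql
-- Sanity check: confirm the CUR table created by the CloudFormation
-- template is queryable. The database and table names here are
-- assumptions; replace them with the ones in your account.
SELECT
   line_item_usage_start_date,
   line_item_product_code,
   line_item_unblended_cost
FROM athenacurcfn_demo_cur_report.demo_cur_report
LIMIT 10;
```

If this returns rows, the integration is working and the report data has been delivered.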

A database in Athena is a logical grouping for tables you create in it. Creating a database in the Athena console query editor is straightforward.

To create a database manually using the Athena query editor:
  1. Open the Athena console at https://console.aws.amazon.com/athena/

  2. On the Editor tab, enter the Hive data definition language (DDL) command CREATE DATABASE myDatabase. Replace myDatabase with the name that you want to use. For restrictions on database names, see Names for tables, databases, and columns.

  3. Choose Run or press Ctrl+ENTER.

  4. To make your database the current database, select it from the Database menu on the left of the query editor.
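The steps above boil down to a single DDL statement. A minimal example (the database name cur_db is an assumption, substitute your own):

```sql
-- Create a logical database to group the CUR tables.
-- 'cur_db' is an example name; replace it with your own.
CREATE DATABASE IF NOT EXISTS cur_db;
```

The IF NOT EXISTS clause makes the statement safe to re-run if the database already exists.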

Step 3: Using a SQL Query to Pull Data from the CUR

Once the database is created, you can use the SQL query below to dive deep into your EC2 costs, adjusting the attributes as you require:

-- For ${date_filter}, options are listed here: https://wellarchitectedlabs.com/cost/300_labs/300_cur_queries/query_help/#filtering-by-date
-- Example:
--      year = '2022' AND month = '07'
--                 -- OR --
--      line_item_usage_start_date >= now() - INTERVAL '3' month

SELECT
   bill_payer_account_id,
   line_item_usage_account_id,
   line_item_product_code AS AWS_Service,
   line_item_resource_id AS Resource_ID,
   ROUND(SUM(line_item_usage_amount), 2) AS Usage_Amount,
   pricing_unit,
   ROUND(SUM(line_item_unblended_cost), 2) AS Cost,
   product_product_family,
   line_item_operation AS API_Operation,
   month,
   year
FROM
   ${table_name} -- Replace ${table_name} with your CUR table name, for example: customer_all
WHERE
   ${date_filter} -- Replace ${date_filter} with an option from above
   AND line_item_product_code = 'AmazonEC2' -- insert AWS Service
GROUP BY
   line_item_product_code,
   bill_payer_account_id,
   line_item_usage_account_id,
   pricing_unit,
   line_item_resource_id,
   product_product_family,
   line_item_operation,
   month,
   year
ORDER BY
   ROUND(SUM(line_item_unblended_cost), 2) DESC
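For illustration, here is the same query with the placeholders filled in under assumed names (a CUR table called customer_all in a database called cur_db, filtered to July 2022; adjust all three to match your setup):

```sql
SELECT
   bill_payer_account_id,
   line_item_usage_account_id,
   line_item_product_code AS AWS_Service,
   line_item_resource_id AS Resource_ID,
   ROUND(SUM(line_item_usage_amount), 2) AS Usage_Amount,
   pricing_unit,
   ROUND(SUM(line_item_unblended_cost), 2) AS Cost,
   product_product_family,
   line_item_operation AS API_Operation,
   month,
   year
FROM
   cur_db.customer_all            -- assumed database and table names
WHERE
   year = '2022' AND month = '07' -- example date filter for July 2022
   AND line_item_product_code = 'AmazonEC2'
GROUP BY
   line_item_product_code,
   bill_payer_account_id,
   line_item_usage_account_id,
   pricing_unit,
   line_item_resource_id,
   product_product_family,
   line_item_operation,
   month,
   year
ORDER BY
   ROUND(SUM(line_item_unblended_cost), 2) DESC
```

Because the results are grouped by line_item_resource_id, each row corresponds to an individual EC2 resource ARN, sorted by cost from highest to lowest.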

You can then download the results in CSV format to investigate specific cost anomalies in more detail.

Conclusion

In this blog post, I showed you how to create a Cost and Usage Report and use its data to dive deep into your Amazon EC2 costs with the help of a customized SQL query and Athena. You can find more details on cloud optimization and cost management on my LinkedIn profile. Please let me know what you'd like me to dive deep into in my upcoming blogs! Until next time.


Written by

Nithin Chandran R

I am an AWS Certified Solutions Architect and Cloud Practitioner | AWS Technical, Business, Sales and Cloud Economics Accredited, Well-Architected Framework (WAR) Proficient; Senior Technical Customer Support Specialist for Enterprises (Concierge) at Amazon Web Services (AWS). I've been helping large enterprise customers of AWS for years with their billing, accounts, cost optimization, cost management, migration, and technical automation solutions related to commitment chargeback and showback models.