Using EVALUATE to Integrate and Analyze Timestamp-Based Data in Oracle Analytics Cloud

Introduction

In the context of an analytical project for a customer, we faced a structural complexity within the data model. The customer had two distinct ingestion streams that were isolated from each other, lacking any direct connection. This data separation posed a significant barrier to the type of analysis the client required, which necessitated effective integration between the two streams.

The two streams had the following characteristics:

  1. Events: The first stream consisted of information about specific events. Each event was described by key attributes, including:

    • Event type

    • Event code

    • Start and end timestamps

    • Event duration

  2. Measurements: The second stream included data related to measurements. These, too, were structured with precise details, such as:

    • Measurement type

    • Start and end timestamps

    • Measurement duration

The customer had a clear but technically challenging requirement: to use Oracle Analytics Cloud (OAC) to generate a list that identified the timestamps where specific events and measurements overlapped. This required developing a logic for temporal intersections between the two streams, producing a list of timestamps that simultaneously met specific event and measurement criteria. To ensure flexibility and usability, the system needed to allow users to dynamically filter data via a configurable prompt, enabling the selection of the desired event and measurement types for analysis.

Beyond this primary functionality, the client also requested an additional layer of interactivity. By clicking on any of the identified timestamps, users needed the ability to view a detailed report. This report would provide a "trigger telemetry", offering a complete view of all events and measurements recorded within a range starting from the selected timestamp and extending 15, 30, or 60 minutes prior. The goal was to offer a historical perspective of the events and measurements leading up to the point of interest, enabling deeper analysis.

This challenge was far from straightforward. On one hand, it was necessary to make two isolated data streams “communicate” through complex integration. This was achieved by designing an SQL function capable of dynamically managing the temporal intersection logic, processing data from both streams, and generating coherent and actionable output. On the other hand, we needed to implement a mechanism to call this procedure directly within Oracle Analytics, ensuring a seamless user experience with interactive features and high performance.

The entire process required a thorough understanding of the client’s needs, careful design of the integration logic, and continuous performance optimization. This case study exemplifies how advanced technologies like Oracle Analytics can be leveraged to deliver sophisticated, tailored analytical solutions.

Building the Function

We developed a function in SQL Developer that accepted input parameters corresponding to the prompts configured in Oracle Analytics Cloud. This function was designed to process and manage telemetry data from machinery.

create or replace function test_trigged_telemetry (
    user          varchar2,
    serial_num    varchar2,
    alarm_code    varchar2,
    omac          varchar2,
    level_access  varchar2,
    recipe_code   varchar2,
    OEE           varchar2,
    speed         varchar2,
    rs            varchar2,
    delta         varchar2,
    temperature   varchar2,
    state         varchar2)
return varchar2

Purpose of the test_trigged_telemetry Function

  1. Data Cleanup: Remove any existing data for a specific user.

  2. Dynamic Processing: Handle optional parameters (e.g., alarm code, speed, OEE) and construct SQL queries flexibly.

  3. Data Insertion: Populate a table with telemetry data, associating timestamps and values from different sources.

Key Components of the Function

  1. Function Declaration
    The function is declared with the PRAGMA AUTONOMOUS_TRANSACTION directive, enabling it to execute transactions independently of the calling context. This ensures that its insert and delete operations are committed separately from the caller's transaction.
PRAGMA AUTONOMOUS_TRANSACTION;
  2. Initial Cleanup
    Before inserting new data, the function deletes existing records for the specified user:
query_delete := 'delete from test_telemetry where "USER" = :usr'; 
execute immediate query_delete using user; 
commit;

Note that the column must be referenced as "USER" (quoted), since USER is a reserved word in Oracle SQL, and passing the value as a bind variable avoids both quoting issues and SQL injection. This cleanup ensures no duplicates or conflicts with earlier runs.

  3. Dynamic Query Construction
    The function dynamically constructs a query fragment for each optional parameter (e.g., alarm_code, omac).

Example of dynamic construction:

case  
   when alarm_code is null then query_alarm_code := 'null as alarm_code';  
   when alarm_code is not null then query_alarm_code := 'xxxx';  
end case;
  4. Handling Optional Data
    If one or more parameters are not provided, the function inserts null values into the respective fields. Conditional CASE blocks determine which fields to include in the final result.

  5. Data Insertion
    After constructing the subqueries, the function combines them into a single INSERT INTO query. Each parameter is processed individually to ensure accurate data mapping.
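Putting these steps together, the shape of the function body can be sketched as follows. This is a condensed, hypothetical reconstruction: the table names (test_telemetry, telemetry_source), column names, and variable names are illustrative, not the customer's actual code, and only two of the twelve parameters are shown.

```sql
create or replace function test_trigged_telemetry (
    user        varchar2,
    serial_num  varchar2,
    alarm_code  varchar2,
    speed       varchar2
    -- ... remaining optional parameters omitted for brevity ...
) return varchar2
is
    pragma autonomous_transaction;      -- commits independently of the caller
    query_alarm_code varchar2(4000);
    query_speed      varchar2(4000);
    query_insert     varchar2(32767);
begin
    -- 1. Cleanup: remove the user's previous results
    execute immediate
        'delete from test_telemetry where "USER" = :usr'
        using user;
    commit;

    -- 2. Build one select-list fragment per optional parameter;
    --    missing parameters become NULL columns in the output
    if alarm_code is null then
        query_alarm_code := 'null as alarm_code';
    else
        query_alarm_code := '''' || alarm_code || ''' as alarm_code';
    end if;

    if speed is null then
        query_speed := 'null as speed';
    else
        query_speed := '''' || speed || ''' as speed';
    end if;

    -- 3. Assemble the fragments into a single dynamic INSERT and run it
    query_insert :=
        'insert into test_telemetry ("USER", ts, alarm_code, speed) ' ||
        'select :usr, t.ts, ' || query_alarm_code || ', ' || query_speed ||
        '  from telemetry_source t where t.serial_num = :sn';
    execute immediate query_insert using user, serial_num;
    commit;

    return 'OK';
end;
```

The autonomous transaction is what makes this pattern safe to call from a reporting tool: the delete/insert pair commits on its own, regardless of what the calling session does.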

Implementation on Oracle Analytics Cloud

Once the database logic was finalized, the next step was integrating it with Oracle Analytics Cloud for seamless user interaction.

To achieve this, we enabled the EVALUATE function in the cloud administration console (Figure 1). This function allows custom SQL commands to be executed directly from Oracle Analytics, providing the flexibility needed to call database-defined procedures or functions.

Figure 1. Evaluate Settings in the OAC console to call DB functions.

We then created a new analysis in OAC with a column containing the following formula:

EVALUATE('TEST_TRIGGED_TELEMETRY(%1,%2,%3,%4,%5,%6,%7,%8,%9,%10,%11,%12)',
VALUEOF(NQ_SESSION.USER),
(@{pv_serial_num}['@']{'xxxxxxxxxxx'}),
'@{alarm_trig}',
'@{omac_trig}' ,
'@{level_trig}',
'@{recipe_trig}',
'@{oee_trig}',
'@{speed_trig}',
'@{rs_trig}',
'@{delta_trig}',
'@{temp_trig}',
'@{state_trig}')

Interactive Column

The analysis was designed so this column functioned as an interactive button. By clicking the column header, users could automatically trigger the EVALUATE function, which executed the SQL logic in the database. This approach simplified interaction, minimized manual errors, and ensured consistent logic application.

Temporal Intersection Analysis

We developed a dedicated analysis to calculate temporal intersections between events and measurements based on user-defined criteria. This was achieved using the INTERSECT set operator (Figure 2).

Figure 2. Analysis to find temporal intersections between events and measurements.

A crucial filter was included to distinguish between concurrent queries from different users:

"Test Telemetry"."USER" = VALUEOF(NQ_SESSION."USER")

This ensured unique session management and avoided shared cache issues.
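The shape of that analysis can be illustrated with the following logical SQL sketch. The SOURCE and TIMESTAMP column names are assumptions for illustration; only the "USER" filter is taken verbatim from the actual solution.

```sql
-- Timestamps that appear in BOTH the event rows and the measurement rows
-- written by the function for the current session's user.
SELECT "Test Telemetry"."TIMESTAMP"
FROM   "Test Telemetry"
WHERE  "Test Telemetry"."SOURCE" = 'EVENT'
AND    "Test Telemetry"."USER"   = VALUEOF(NQ_SESSION."USER")
INTERSECT
SELECT "Test Telemetry"."TIMESTAMP"
FROM   "Test Telemetry"
WHERE  "Test Telemetry"."SOURCE" = 'MEASUREMENT'
AND    "Test Telemetry"."USER"   = VALUEOF(NQ_SESSION."USER")
```

Because each branch is filtered to the session user, two analysts running the analysis concurrently each see only the intersections computed from their own parameters.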

Dynamic Timestamp Navigation

To enhance interactivity, we added a dynamic navigation feature. Users could click on an intersected timestamp and select a time period of interest (e.g., 15 minutes, 30 minutes, or 1 hour prior to the timestamp). This interaction redirected them to a new page showing a detailed report for the selected time range, providing comprehensive trigger telemetry (Figure 3).

Figure 3. Interaction settings: once the popup appears, the user can choose one of the displayed links.

Conclusion

The implementation of this solution marked a significant breakthrough for the client, enabling them to overcome a structural challenge in their data model and integrate two initially isolated data streams. Leveraging the advanced capabilities of Oracle Analytics Cloud, such as EVALUATE, interaction with presentation variables, and dynamic navigation, we successfully created a system that combines the analytical power of the database with the flexibility and intuitiveness of the user interface. The result was an analytical environment that not only meets the client's requirements but also enhances data usability, enabling faster analyses and more informed decision-making.

Written by

Veronica Orciuoli

Hello! I am a consultant specializing in Oracle Analytics Cloud, where I assist my company in transforming data into meaningful insights. My passion for data analysis complements my curiosity for discovering new stories. When I'm not immersed in the data world, you can find me binge-watching the latest shows or discussing my favorites with friends and colleagues.