Pulling Azure Analysis Services logs from Azure Log Analytics into Power BI using Kusto Query Language (KQL)
We have very large Azure Analysis Services (SSAS) cubes with billions of records and hundreds of users, so we need to be able to monitor query performance. About 95% of those users consume reports built for them by knowledgeable report builders; the report builders themselves are the 5% we want to monitor.
The report builders normally work in dev; however, there are situations where a one-off or new report is developed directly against production and has an impact there.
This method of reporting is near real-time rather than real-time, for two reasons. First, SSAS doesn't emit information during a query's execution the way, say, SQL Server does; it only tells you when something starts and when it finishes. The good news is that the 'QueryEnd' event carries all the calculated metrics (start time, end time, total duration, etc.), so you don't need to capture both 'QueryStart' and 'QueryEnd'. Second, the system behind Log Analytics workspaces is Azure Data Explorer (ADX), which itself needs time to ingest; we see about a 5-minute delay.
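If you want to verify that delay in your own workspace, KQL's ingestion_time() function can be compared against each record's TimeGenerated timestamp. A minimal sketch (the one-hour lookback is an arbitrary choice):

AzureDiagnostics
| where TimeGenerated > ago(1h)
    and ResourceProvider == 'MICROSOFT.ANALYSISSERVICES'
    and OperationName == 'QueryEnd'
// ingestion_time() is when the row became queryable; TimeGenerated is when the event occurred
| extend IngestionDelay = ingestion_time() - TimeGenerated
| summarize AvgDelay = avg(IngestionDelay), MaxDelay = max(IngestionDelay)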
To interact with the ADX cluster you write queries in KQL (Kusto Query Language), which looks like a cross between SQL and Unix shell scripting. Below is the query.
AzureDiagnostics
| where ResourceProvider == 'MICROSOFT.ANALYSISSERVICES'
    and Resource in ('<myserver1>','<myserver2>')
    and OperationName == 'QueryEnd'
    and NTDomainName_s == 'AzureAD'
| project ServerName = Resource, CubeName = DatabaseName_s, QueryId = RootActivityId_g,
    QueryStart = StartTime_t, QueryEnd = EndTime_t, Duration = Duration_s, CPUTime = CPUTime_s,
    Success = Success_s, Error = Error_s, Query = TextData_s
You start with the table you want to query, AzureDiagnostics (you can do joins, but they are beyond the scope of this article), then use a | (pipe) to send that output to the next operator, in this case a where clause. (Note that KQL operators are case-sensitive and lowercase, so it's and, not AND.) After the filter you pipe the results to a project, which is the equivalent of a SQL select list but, unlike in SQL, is optional. I use project here both to create user-friendly column names and to reduce the number of columns (for efficiency on both ends).
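Once you're comfortable with the pipe-and-operator pattern, extending the query is straightforward. For example, a summarize operator can roll the raw events up into per-cube statistics before anything leaves the workspace. A minimal sketch, assuming Duration_s holds the SSAS trace duration in milliseconds (the _s suffix means the column is a string, hence the todouble conversion):

AzureDiagnostics
| where ResourceProvider == 'MICROSOFT.ANALYSISSERVICES'
    and Resource in ('<myserver1>','<myserver2>')
    and OperationName == 'QueryEnd'
    and NTDomainName_s == 'AzureAD'
// _s columns arrive as strings, so convert before aggregating
| extend DurationMs = todouble(Duration_s)
| summarize QueryCount = count(), AvgDurationMs = avg(DurationMs), MaxDurationMs = max(DurationMs)
    by ServerName = Resource, CubeName = DatabaseName_s
| order by AvgDurationMs desc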
The next step is to convert this into a Power BI M query, which is available under the Export option in Log Analytics. Take that generated query, run it from within Power BI, and you now have a dataset to build your reports on. The downside is that you have to refresh the data manually; however, there is a mechanism to make this streaming. If there is interest in streaming, or you have any questions, please let me know in the comments.
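One tweak worth considering before you export: bound the time window so each manual refresh only pulls recent rows instead of the whole table. A sketch of the same query with a TimeGenerated filter added (the one-day window is an arbitrary choice; match it to your refresh cadence):

AzureDiagnostics
// limit each refresh to the last 24 hours of events
| where TimeGenerated > ago(1d)
| where ResourceProvider == 'MICROSOFT.ANALYSISSERVICES'
    and Resource in ('<myserver1>','<myserver2>')
    and OperationName == 'QueryEnd'
    and NTDomainName_s == 'AzureAD'
| project ServerName = Resource, CubeName = DatabaseName_s, QueryId = RootActivityId_g,
    QueryStart = StartTime_t, QueryEnd = EndTime_t, Duration = Duration_s, CPUTime = CPUTime_s,
    Success = Success_s, Error = Error_s, Query = TextData_s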