Getting started with troubleshooting a slow Power BI report page

Performance is always an issue, isn’t it?  Throughout a career that has wandered through IIS (my web pages are slow), SharePoint (my web parts/lists/search queries/indexes are slow), SQL Reporting (my SSRS reports/stored procedures/Excel pages are slow) and now Power BI (my visuals are taking forever to load), troubleshooting slow performance has always been a big part of what I do.

Troubleshooting Power BI visuals can be a little tricky.  There aren’t any obvious dials or gauges to look at, you can’t spin up perfmon and attach it to Power BI Desktop, and the logs, while impressive looking, won’t help you narrow in on the poorly written measure that is killing your performance.  What I am going to lay out next is a quick approach that you can take to not only get a good look at the performance of a report page but also narrow in on the measures that are dragging you down.

Quick setup note – I am using the Customer Profitability sample from Microsoft for my PBIX file.  Its visuals load super quickly, but it’s a quick and easy download here.

First things first, your reports and data model need to be in the same PBIX.  We will be using DAX Studio to connect to the data model and run a trace, so everything we are testing needs to be in the same PBIX.  If you have your visuals and data model in separate PBIX files, you will need to recreate your visuals in the PBIX where your data model lives.

STEP ONE:

Create a blank report page.  Power BI Desktop will load the visuals on whichever report page is active when you open the PBIX, so in order to capture a true picture of the page performance, you need to create a blank report page and save the PBIX with that page active.

STEP TWO:

Close and reopen your PBIX file.  If you did step one, you should be looking at a blank report page.

STEP THREE: 

Open DAX Studio and the ‘Connect’ screen should open.  Select your open PBIX document as shown and select ‘Connect’.

STEP FOUR:

Once connected, click on the ‘All Queries’ button in the ribbon.  This actually starts a trace on the SSAS instance that is running inside Power BI Desktop.  Once the trace is ready, you will see ‘Query Trace Started’ in the output window.

STEP FIVE:

Return to the PBIX that you have open in Power BI Desktop.  Click on the report tab that you wish to trace and let the page fully load.  Once the page loads, you can stop the trace by returning to DAX Studio, choosing the ‘All Queries’ tab and selecting the stop button.

STEP SIX:

Once the trace is stopped, click on the Duration column header to sort the queries by duration.  As I mentioned earlier, this demo is super fast, so the ‘slowest’ query took 21ms, but hopefully you get the point.  You now have a list of the queries that were run to build your page, along with the time it took to execute each of them.

STEP SEVEN:

Continuing on, double-click on the query text in the ‘Query’ column.  The actual code used will show up in the editor section above the output section.  Now you can analyze the DAX being called as well as run an individual trace to dig in deeper.
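To give you an idea of what lands in the editor, here is a rough sketch of the kind of query Power BI generates for a single visual.  The table, column and measure names below are made up rather than taken from the sample file; what you will recognize in your own traces is the SUMMARIZECOLUMNS shape that Power BI uses over and over.

// Sketch of a typical visual query captured by the 'All Queries' trace.
// 'Customer'[Industry] and [Total Revenue] are hypothetical names for illustration.
EVALUATE
SUMMARIZECOLUMNS (
    'Customer'[Industry],              // the field on the visual's axis
    "Total Revenue", [Total Revenue]   // the measure displayed in the visual
)
ORDER BY 'Customer'[Industry]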

DIGGING DEEPER:

At this point, you can run all of the DAX in the editor or you can highlight and run just sections of it, just like normal in DAX Studio.  If you enable the Query Plan and Server Timings options, you can capture a trace and see the actual queries that are being passed to the formula engine and the storage engine for processing.  Enabling the Query Plan option does just what it says: it gives you the query plans, both physical and logical, that were chosen to run the queries.
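This is also where you can pull a suspect measure out of a captured query and time it on its own.  Here is a minimal sketch of that idea; the measure and column names are placeholders rather than anything from this report, and the two queries are meant to be highlighted and run one at a time.

// Time a single measure in isolation ([Total Revenue] is a placeholder name).
EVALUATE
ROW ( "Result", [Total Revenue] )

// Override the model definition for this query only, to test a rewrite
// without touching the model (Sales[Quantity] and Sales[Price] are hypothetical columns).
DEFINE
    MEASURE Sales[Total Revenue] = SUMX ( Sales, Sales[Quantity] * Sales[Price] )
EVALUATE
ROW ( "Result", [Total Revenue] )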

I have a long animated GIF below that shows turning on the Query Plan and Server Timings options, setting the ‘Run’ option to flush the cache each time I run a query, and then running the query.  I then show where you can find the query plan and server timings information.  Since I got the whole screen in the GIF, it’s a pretty lousy resolution, but perhaps if you open it in another tab you can see enough detail.

Questions, comments, suggestions on digging deeper into Power BI visuals/reports performance?  Throw me a comment or hit me up on Twitter – @szBigDan.

The anatomy of a DAX mistake – Analyzing DAX Performance

I have been working with a retail customer that uses a 445 date table (briefly, a 4-4-5 calendar splits each quarter into two four-week periods and one five-week period so that every period ends on the same day of the week).  Matt Allington over at PowerPivotPro wrote a great blog post on what a 445 date table is and how to create one, but creating one wasn’t my issue because the customer provided me with the one that they use (whew).

Instead, I was doing some custom time intelligence with DAX, since I couldn’t use the built-in functions due to the wacky date table.  Specifically, the customer wanted a lot of their metrics expressed in terms of the last week, the last four weeks, the last 13 weeks and the total.

As a side note, the interesting thing about DAX is that there are usually all sorts of ways to do the same thing.  I am not presenting the following approach as a template for how to solve this problem; I am just using what I did to show you how I found my first (hah, hah) performance problem and solved it.

I started by defining a simple measure that would give me the latest date for the data that I had in my data set.

CURRENT WEEK ID = MAX ( 'Date'[DateID] )

Since my Date table only has data for weeks that I have data in my fact table, this gives me the max Date ID for my dataset.  With this information in hand, I created some custom helper tables in my model.  The example below is the one created for the previous thirteen weeks helper table.

Rolling_13_Date =
VAR maxID = [CURRENT WEEK ID]
RETURN
    FILTER (
        'Date',
        'Date'[DateID] >= ( maxID - 12 )
            && 'Date'[DateID] <= maxID
    )

I made a helper table for each of the custom time periods: 13 weeks, 4 weeks and 1 week.  These tables are created dynamically when the data model is refreshed and they are joined back to the main data table as shown:
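For reference, the four-week helper table follows the same pattern.  Only the thirteen-week version is shown above, so treat this as a sketch of what the other definitions look like rather than the customer’s exact code:

Rolling_4_Date =
VAR maxID = [CURRENT WEEK ID]
RETURN
    FILTER (
        'Date',
        'Date'[DateID] >= ( maxID - 3 )
            && 'Date'[DateID] <= maxID
    )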

So the thinking was this: to make life easy for myself, any time that I had to present a measure over one of these time periods, I would just do the following:

My4WkMeasure = CALCULATE ( [MyMeasure], Rolling_4_Date )

Initial testing went great, it worked as expected and it was super easy to remember this approach.

Of course, I did mention that when I am doing development, it is against a single week of data, right?  Yeah, I probably should have mentioned that.  Once I pulled in all of the data, here is a sample of the blazing fast speed that my measure was cranking out:

That is the ‘Server Timings’ readout from DAX Studio.  My measure, just one single measure using this approach, took 4.5 seconds to load.  For such a simple measure (basically total sales), it actually would have taken almost 15 seconds if it didn’t spread the love out over multiple cores.  Even better, it’s not a lot of data being materialized; the time is all CPU time, so I am making the query engine work WAY too hard for such a simple calculation.

When I started digging into the query, I quickly saw my mistake.  I don’t quite understand the full details of it because I am not Marco Russo or Alberto Ferrari, but I could see enough from the queries shown in DAX Studio to understand what I did wrong.  By specifying an entire table as a filter instead of a column, I made the query engine perform a massive join that pulled in my entire rollup table, the main date table and the other connected rollup tables following the 1:1 relationships.  All of the columns in the new mashed-up table were then used in such a massive WHERE clause that it actually overran the buffer in DAX Studio before it captured all of the text.  Usually DAX Studio will cut off the text and tell you how many more lines remain, but I think I overwhelmed it with my nonsense.  The resulting WHERE clause looked like it was filtering on every available value in all of the date columns in all of the joined date tables.

That is only a short piece of the never-ending WHERE clause (yes, it goes on and on, my friend…).  Wow, I am not doing something right.

To try and limit the JOIN, I changed my measure to the following, this time specifying a column:

My4WkMeasure = CALCULATE ( [MyMeasure], Rolling_4_Date[DateID] )

That’s more like it!  Total time: 48ms.  Again, I am not exactly sure why it’s so much faster, but if you look at the queries, the WHERE clause is much simpler.

OK, wrapping up, what did I learn here?  Always specify a column in your filter expression when using CALCULATE?  Always test your measures against production data?  Always keep DAX Studio handy?

I guess so; these are all good things to know.  More importantly, I got to immediately put to work the mind-bending stuff that I am learning in the SQLBI.com course Optimizing DAX.  I am only halfway through the course and am already using way more of DAX Studio than I knew even existed before I started.  This is not a paid endorsement, by the way; I ponied up out of my own pocket to get a bunch of the videos that these two mad scientists cooked up and they have been worth every penny.

Using DAX Studio, I was able to performance test a measure by itself, quantify how bad a job I did with it, see where the slowness was coming from (yes, it’s the WHERE clause that doesn’t end…), make a change and immediately quantify how fast the measure was once I fixed it.

I will try to blog more perf scenarios as I come across them, but for now, if you don’t know DAX Studio and you are responsible for DAX/Power BI/SSAS Tabular performance, you need to learn it now.

And in case I didn’t firmly stick Lambchop into your ear with this post, enjoy: