Tips on How to Increase Tableau Dashboard Performance

Tune your Tableau Workbook like a Gibson Guitar

“My report is so slow!” How many times have you heard that from a client (or from yourself)?
I’m sure I am not alone when it comes to the frustration of working with a slow Tableau dashboard! And of course, the client, who is using the report daily to make business decisions, finds this even more annoying.

Slow dashboards lead to an understandable decline in usage, so we need to do all we can to enhance the user experience and add value to the users’ role rather than adding to their stress and frustration.

I can hear the shouts of “But the client specifically asked for the report to be built like this!” or, “The client wants ALL these drop-down filters and wants EVERYTHING squeezed on one page!”

Yes, that happens to all of us. First and foremost, it’s up to the viz developers to educate the client on the tool and its functionalities, explaining the value behind “Less is more”. We need to find the balance between giving the client what they think they need and building a well-performing dashboard that provides all they require in an efficient, user-friendly way.

The best way to isolate the issues behind slow performance is to run a performance test. These tests can be run on Tableau Server or on Tableau Desktop.

How to do Performance Testing on Tableau Server:

1. Open your workbook and paste :record_performance=yes& between the question mark and the colon toward the end of the URL, as below:
a. Workbookname/viewname?:record_performance=yes&:iid=1
2. The performance icon will then appear.
3. Click on the Refresh icon and then once the report is refreshed, click on the Performance icon and a Performance report will be generated.
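The URL edit in step 1 can be sketched as a small helper. This is just an illustration of where the parameter goes; the server and view path below are invented:

```python
def add_performance_recording(url: str) -> str:
    """Insert Tableau's :record_performance parameter right after the '?',
    turning .../viewname?:iid=1 into .../viewname?:record_performance=yes&:iid=1."""
    if "?" not in url:
        return url + "?:record_performance=yes"
    base, _, params = url.partition("?")
    return f"{base}?:record_performance=yes&{params}"

# Hypothetical workbook/view path, for illustration only
url = "https://myserver/views/SalesWorkbook/Overview?:iid=1"
print(add_performance_recording(url))
# https://myserver/views/SalesWorkbook/Overview?:record_performance=yes&:iid=1
```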

How to do Performance Testing on Tableau Desktop:

1. Open your workbook and click on Help>Settings and Performance>Start Performance Recording
2. Make a number of selections on your dashboard, changing filters, drilling into detail etc.
3. Click on Help>Settings and Performance>Stop Performance Recording and a Performance report will be generated.

Always ensure you download and save your initial test results. Once you have made performance enhancing changes to your workbook, run another test and compare the results to ensure the changes are making a positive impact.
Tableau performance tests will highlight the outputs that are taking up the most time and hindering the overall performance of your dashboard.
The most common culprits are:
1. Computing Layouts
2. Compiling Query
3. Executing Query

Complex Workbooks Decrease Performance!

If layout computing is high on the time-consuming list, your workbook is too complex. You need to simplify your dashboards to optimise performance. It’s not essential to include everything on the landing page of your report and keep in mind that large crosstabs can be particularly workload intensive.

Use your landing page to deliver a high-level overview of the data. Then allow the user to interact with the view, using Tableau’s functionality to drill down to the detail they require.

For example: A retailer’s sales report:

The landing page may have a few blocks across the top showing total business numbers for:

1. Sales Value Current Year & Previous Year, YoY growth
2. Sales Volume Current Year & Previous Year, YoY growth
3. Gross Profit %
4. Budget Value and Variance to Actual Value
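As a sketch, the headline numbers in those blocks boil down to a few simple calculations. The sample figures here are invented for illustration:

```python
# Invented sample figures for a retailer
sales_cy, sales_py = 1_250_000.0, 1_000_000.0   # sales value, current vs previous year
cost_cy = 875_000.0
budget_cy = 1_200_000.0

yoy_growth = (sales_cy - sales_py) / sales_py           # 0.25 -> 25% YoY growth
gross_profit_pct = (sales_cy - cost_cy) / sales_cy      # 0.30 -> 30% gross profit
budget_variance = sales_cy - budget_cy                  # +50,000 ahead of budget

print(f"YoY growth: {yoy_growth:.0%}, GP: {gross_profit_pct:.0%}, "
      f"variance to budget: {budget_variance:+,.0f}")
```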

Once that’s done, you can start to fill the rest of the landing page.

• Below these blocks, you may choose to include Bar, Tree and/or Line graphs based on Region, Timeline/Months, and/or Categories. Don’t overdo it by trying to show too many graphs.
• Use these graphs as interactive action filters to change the high-level numbers, as well as isolating information on the other two secondary graphs based on user selections.
• Add tooltips to show the user additional information without squashing too much onto the graph.
• Add action menu filters to allow the user to drill to the lower-level detail on another sheet linked to the dashboard. This is where you could consider bringing in your crosstab table. When the user reaches this point in the report, they will have already reduced the data through their action filter selections, so the queries will not have to work through as many rows. The crosstab might, for example, only be looking at a product-level sales vs budget table for the Furniture category in the Eastern Region for the 2nd quarter of the current year.

Compiling Queries on Tableau can also slow down your Workbook

This refers to the amount of time Tableau spends generating the query. Slow performance highlighted here can be due to too many filters and/or complex multi-level front-end calculations.

How to use Filters the Right Way on Tableau:

Where possible, cut back on quick drop-down filters and instead train the users on the benefits of interactive action filters. Not only are these more performant, but once users get accustomed to them, they are a more efficient way of interacting with and navigating through the reports.

Look at replacing some filters with parameters where the data and required outcomes allow. Here are some useful tips:

• Assess your filters and, where possible, make them “In Context”.
• Context filters generate a temporary table from which all “normal” filters are applied. However, these can negatively impact performance when used incorrectly. Be careful to only use Context filters on fields that significantly cut down the data and more importantly, are not changed very often.
• Use data source filters to reduce the volume of unnecessary data being queried with each action by the user.
• Avoid using “Only Relevant” filters. These add to the workload as Tableau needs to find the distinct values of each filter based on all other filters on the dashboard.
• Customise multi-select filters to show the “Apply button”. The user is then able to make all required selections on a specific filter before executing.

Using Calculations on Tableau to Optimize Performance

Level of Detail Calculations (LODs):

Often the data dictates that LODs and other complex calculations are unavoidable; however, many of these LODs may have originally been created to remove duplications caused by joining tables.

In these cases, it is worth going back to your older workbooks and, where possible, changing your data sources to use Relationships rather than Joins.

Tableau has introduced the Relationship functionality as a more efficient way of creating data sources from multiple tables. Relationships maintain the granularity of each table and thus remove the chance of duplications that often result from joins.

Once you have created your new data set, check your output; it is likely that you will no longer require all the “duplicate removing” LOD calculations.
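To see why row-level joins create the duplicates those LODs were fixing, here is a minimal sketch in plain Python, with made-up order and shipment rows:

```python
# One order with two shipment rows: a row-level join duplicates the order value
orders = [{"order_id": 1, "sales": 100.0}]
shipments = [{"order_id": 1, "carrier": "A"}, {"order_id": 1, "carrier": "B"}]

# Join-style flattening: every matching shipment repeats the order's sales
joined = [{**o, **s} for o in orders for s in shipments
          if o["order_id"] == s["order_id"]]
join_total = sum(row["sales"] for row in joined)        # 200.0 -- double counted!

# Relationship-style behaviour: each table keeps its own granularity,
# so sales is aggregated at the orders level only
relationship_total = sum(o["sales"] for o in orders)    # 100.0 -- correct

print(join_total, relationship_total)
```

This is the duplication that “duplicate removing” LODs (e.g. a FIXED expression per order) work around; Relationships avoid it at the data-model level.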

Back-end calculations:

Take a step back and take another look at your data.

• Are you trying to do too much on the front end?
• Are your complex nested calculations adding to the workload in every query?

If so, then consider moving some of the calculations into the underlying database.
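As an illustration of pushing work to the database, a metric like margin can be computed once in SQL (in a view, or a dbt model) instead of as a nested Tableau calculation evaluated on every query. SQLite stands in for your warehouse here, and the table and columns are invented:

```python
import sqlite3

# In-memory database standing in for the real warehouse
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, revenue REAL, cost REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                 [("East", 100.0, 60.0), ("West", 200.0, 150.0)])

# Margin computed once in the database rather than per-query in the workbook
rows = conn.execute(
    "SELECT region, ROUND((revenue - cost) / revenue, 2) AS margin FROM sales"
).fetchall()
print(rows)  # [('East', 0.4), ('West', 0.25)]
```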

Executing Queries on Tableau

Extended query execution times are often related to workbooks using live data sources. However, workbooks already using extracts may also suffer slow query execution due to the size of the data source (not only the number of rows but the width as well), too many complex front-end calculations, filters, and unnecessary complexity.

Extracts on Tableau

Always consider using extracts to speed up performance.

Live data vs Scheduled Updates to extracts:

Start with asking the question, “Is it a requirement for those Daily Sales reports to be live when the client is only looking at them in the morning and not again for the rest of the day?”

In my experience, generally, the answer is “No”. Create an extract of your data and publish it to the server on a scheduled refresh. The refresh could be set to run daily in the early morning, updating the reports to show the previous day’s sales, or as often as every 15 minutes or less. Base your choice of schedule on ensuring the users’ needs are met without overdoing it. Whatever schedule you choose, working off a data source extract will always improve the performance of the report versus a live connection.

The next question you need to ask is, “How much data does the client need for their analysis?”

Removing the live connection is only part of the benefit. We can also use an extract to reduce the size of the dataset. If the users only require the last three years for year-on-year growth comparisons in their reports, then there is no benefit to keeping ten years of data in the extract. The size just negatively impacts the speed of every click and subsequent query.

If the client panics that their data is “gone”, reassure them that the data is all still available in the data source and can be brought into the extract and thus, the report, with a quick change to the extract filter if ever required.

An additional tip: when adding date-based extract filters, it’s best to select the “Relative Date” option to ensure you don’t have to keep going back in to update the filter as the years change.
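The relative-date idea is simply “keep the last N years as of today”, so the window rolls forward on its own. A rough sketch of that behaviour, with invented sample dates:

```python
from datetime import date

def keep_last_n_years(rows, n_years, today):
    """Mimic a relative-date extract filter: keep rows on or after Jan 1st
    of (current year - n_years + 1), so the window rolls forward each year."""
    cutoff = date(today.year - n_years + 1, 1, 1)
    return [r for r in rows if r["order_date"] >= cutoff]

rows = [{"order_date": date(2015, 6, 1)},    # dropped: outside the window
        {"order_date": date(2021, 3, 15)},
        {"order_date": date(2022, 1, 2)}]

# With a 3-year window as of mid-2022, only rows from 2020 onward remain
kept = keep_last_n_years(rows, 3, today=date(2022, 6, 1))
print(len(kept))  # 2
```

(Exact boundary behaviour in Tableau depends on the relative-date options you pick; the point is that the cutoff is computed from “today”, not hard-coded.)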

A few more general tips to optimise performance on Tableau:

Filters are important!

Yes, I am repeating myself, and I’m doing it on purpose. The old saying goes, “All roads lead to Rome”; I like to say, “Most dashboard performance issues follow from Filters”.

Filters were mentioned earlier as a cause of slow query compilation; they are also guilty of impacting query execution, so I felt it was a good idea to squeeze a reminder of what to avoid and what to use into this section.

• Avoid Quick drop-down filters and “Relevant only” filters
• Use Action Filters and parameters

Unused fields & Sheets

During the build and validation process we are constantly testing and making changes. This, of course, often leads to multiple versions of the same calculation. We’ve all seen those “copy of copy” fields, or “Sales_new”, “Sales_NEW”, and so on.

Performance aside, these can get extremely confusing when you come back to your workbook in a few months’ time to make adjustments. Once your dashboard is complete, make sure you delete all unnecessary calculations.

Hide all unused fields before you create and publish your data source extract; this will reduce the width, and thus the volume, of your dataset, meaning less work is required for each query.

Also remember to delete any unnecessary and test sheets from your workbook before publishing; only keep what you need. As a side note, make sure you name your sheets and dashboards appropriately. You don’t want to come back to Sheets 1 through 20 in a few months’ time, or even worse, to “Sales”, “Sales (2)”, and “Sales (3)” when “Sales (3)” has stock values on it.

Dashboard size

Avoid using “automatic size” for your dashboard. The report will have to adjust to the varying screen resolutions of the users, which negatively impacts its performance.

Grand Totals and Sub Totals

The generation of totals slows dashboard performance; however, most of the time they are a necessary requirement.

A performance-enhancing workaround is to duplicate your sheet and remove the dimension from the columns or rows so that only the total values are shown.

Add a text field with the text “Total”, place the sheet next to or below the crosstab table sheet as required, and set your padding to zero to give the impression of it being all one sheet.


Avoid blends and cross-data-source calculations where possible. Consider doing the work in the underlying data source to create the data set you require, or use Relationships to build data sets from multiple source tables wherever possible.

To Conclude

Remember that the purpose of any report or dashboard we build is to answer a question through the visualization of data. We need to do this in a way that provides the user with the required information in a format that is easy to understand and absorb, allowing them to make faster, more insightful decisions.

Dashboards that are too busy and overcomplicated tend to be slow, cumbersome, and difficult to navigate; this is the opposite of the goal we are trying to achieve.
Always keep the required outcome in mind and make use of Tableau’s best practices to achieve it in the most efficient, user-friendly way.

Interested in more? Check out our product, Vantage Point. Vantage Point (VP) is a no-code, click & go business acceleration tool which enables data driven decisions across your business. It drives interactivity across all parts of your organization by communicating value (KPIs), autogenerating tasks with cutting-edge ML/AI technology and enabling users to combine VP’s ML/AI recommendations with their own analysis. You can finally track the exact ROI impact throughout your entire business with Vantage Point.

Sign up for a demo with the link below

Written by

Rene van Breukelen

Solutions Specialist
