In the ever-evolving world of data architecture, decision-makers are often faced with a foundational choice: ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform)? For years, ETL was the gold standard—especially when on-prem systems and batch processing dominated the landscape. But as cloud computing, real-time analytics, and modern data stacks surged, so did the practical advantages of ELT.
This post explores real-world scenarios where ELT didn’t just work—it outperformed traditional ETL by a wide margin. These insights are for teams stuck at the crossroads of modernizing their stack, scaling operations, or simply tired of overcomplicating their data pipelines.
Use Case 1: Real-Time Data Visibility for Marketing Dashboards
A global marketing firm approached our team with a common problem: delays in reporting. Their ETL process took over 8 hours to run, rendering “daily” dashboards outdated before stakeholders even opened them.
By shifting to ELT, we pushed raw data into a cloud warehouse as it was created—no waiting. From there, lightweight transformations inside the warehouse made it possible to update dashboards in near-real-time.
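To make that concrete, here is a minimal sketch of what an in-warehouse transformation can look like once raw data lands continuously. The table and view names (raw.ad_events, analytics.daily_campaign_spend) are hypothetical, and the SQL is generic rather than the client's actual logic:

-- The "T" in ELT: raw events load untouched, and this view
-- reshapes them on read, inside the warehouse.
CREATE OR REPLACE VIEW analytics.daily_campaign_spend AS
SELECT
    campaign_id,
    CAST(event_timestamp AS DATE) AS event_date,
    SUM(spend_usd) AS total_spend,
    COUNT(*) AS event_count
FROM raw.ad_events
GROUP BY campaign_id, CAST(event_timestamp AS DATE);

Because dashboards query the view and the raw table keeps receiving data, reports stay near-real-time without a separate transformation job.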
This switch drastically improved executive visibility and marketing agility. The shift was made possible by a smarter data foundation, powered by our data engineering consulting services in Austin, Texas. Transforming data after loading it gave teams the flexibility to run multiple transformation versions and improve queries without touching upstream logic.
Use Case 2: Enabling Advanced Analytics in Healthcare
Healthcare providers are under immense pressure to turn data into actionable insights, fast. In one scenario, a client with strict HIPAA compliance rules needed to merge EHR data from various sources to identify trends in patient outcomes.
Previously, their ETL toolset struggled with data volume, versioning issues, and schema changes. Our team moved them to an ELT architecture, which loaded all raw data into a secure cloud environment and executed transformations using SQL-based logic—directly within the warehouse.
The result? Analytics teams were empowered to iterate faster, adapt to regulatory changes, and produce more accurate models using services like our advanced analytics consulting services in Texas. Because the raw data was always available, models could be retrained or compared against historical versions instantly—something traditional ETL couldn’t support without redesign.
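As a hedged illustration of that last point: since ELT leaves the raw layer intact, two transformation versions can coexist as views and be compared side by side. The names below (raw.ehr_records, analytics.patient_outcomes_v1 and _v2) are invented for the example, not the client's schema:

-- v1: the original transformation, kept for historical comparison.
CREATE OR REPLACE VIEW analytics.patient_outcomes_v1 AS
SELECT patient_id, outcome_code, admit_date
FROM raw.ehr_records
WHERE outcome_code IS NOT NULL;

-- v2: a revised transformation over the same untouched raw data;
-- it keeps rows v1 dropped by labeling missing outcomes explicitly.
CREATE OR REPLACE VIEW analytics.patient_outcomes_v2 AS
SELECT patient_id,
       COALESCE(outcome_code, 'UNKNOWN') AS outcome_code,
       admit_date
FROM raw.ehr_records;

Retraining or comparing a model against the older logic is then just a matter of pointing it at a different view.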
Use Case 3: Agile Product Analytics with Tableau
An e-commerce client needed to understand how product features impacted user engagement, but their ETL processes were rigid and hardcoded. Every schema change required days of rework, blocking fast experimentation.
We introduced a cloud-native ELT approach that funneled all user interaction logs into their warehouse continuously. With the data already accessible, business analysts could use advanced Tableau consulting services in Texas to explore metrics in real time, apply custom calculations, and even test hypotheses without involving engineering.
This dramatically improved how fast teams could respond to product performance questions, iterate on UX experiments, and deliver reports that aligned with rapidly changing business priorities. It wasn’t just faster—it was finally scalable.
Why ELT Wins in the Cloud Era
The shift to ELT is not about replacing ETL everywhere—it’s about knowing when to use the right tool for the job. ELT thrives when:
Data volume is high
Schema evolution is frequent
Real-time insights are critical
Multiple teams need access to raw or semi-processed data
You want analytics to evolve without changing core logic upstream
These advantages are amplified when paired with robust warehouse technologies like Snowflake, BigQuery, or Redshift. ELT enables data engineers to build scalable pipelines, analysts to iterate quickly, and business leaders to make informed decisions faster.
It’s More Than a Trend—It’s a Strategy
Many organizations hear “ELT” and assume it’s just another buzzword. But as the above use cases show, it’s a strategic advantage when deployed correctly. ELT doesn’t just streamline the data journey—it creates room for innovation.
If your team is still stuck debating whether to move to ELT, it might be time to explore your current bottlenecks. Are your reports always delayed? Are schema changes dragging down your entire dev cycle? Is your warehouse underutilized? These are signs that an ELT-centric approach may unlock the performance you’ve been chasing.
Our team at Dev3lop has helped companies across industries migrate to modern data stacks with ELT at the center. Whether it’s integrating with Tableau, Power BI, or MySQL consulting services and other backend systems, our software innovation approach is built to scale with your growth.
In the age of data overload and attention scarcity, ELT isn’t just faster—it’s smarter.
If you’re ready to rethink how your business handles data transformation, now’s the time to explore solutions that scale with you—not against you.
The business world runs on data, but data alone isn’t enough—companies need actionable insights presented clearly and accurately. Dashboards have become essential tools for decision-making, empowering everyone from frontline employees to top executives. Yet, most dashboards fall short, leaving professionals disillusioned and frustrated. Recently, we performed an in-depth audit of 10 dashboards from various industries to pinpoint why these critical tools often underdeliver. Surprisingly, we consistently found the same three mistakes that impeded usability and limited effectiveness. By examining these dashboard missteps, you can ensure your data storytelling empowers your team instead of confusing them. Here’s what we discovered, why it’s important, and most importantly, how you can overcome these common pitfalls.
Mistake #1: Prioritizing Visual Appeal Over Functionality
Data dashboards exist primarily to empower decision-making. Unfortunately, in our audits, we found that many teams prioritize aesthetic considerations over functionality. Dashboards that are initially appealing quickly frustrate users who struggle to grasp the information they need at a glance, leading to misinterpretations, poor insights, slow decision-making, or worse, inaccurate decisions based on unclear data.
Why Prioritizing Functionality Matters
Dashboards should be user-centered, prioritizing clarity and speed of understanding over excessive visual flair. Beautiful visuals are meaningless if they don’t clearly communicate the metric or trend you’re displaying. Every element—charts, graphs, icons—should serve the single purpose of clearly, rapidly delivering actionable insights to the audience.
Many dashboards we analyzed sacrificed usability for extravagant visual elements or cluttered graphs that distracted from the core information. Complex visuals without clear intent confuse users, creating barriers to data-related productivity. As professional analysts, our aim is to structure data visualizations that reduce cognitive load, guiding users seamlessly from observation to comprehension to action.
How to Fix It
Evaluate your dashboard by asking: Do visuals communicate simply and directly? Start by clearly defining the dashboard’s primary audience and their needs. Follow established best practices like clean titles, simplified visualizations, logical grouping, and minimizing reliance on decorative effects that contribute little to understanding.
Proper dashboard development integrates best practices of ETL—Extract, Transform, Load—methodologies to prepare data in meaningful ways. Effective ETL ensures you transform and organize your information carefully before visualizing it. See our breakdown on why and how ETL steps significantly impact data analytics insights here: ETL in Data Analytics: Transforming Data into a Usable Format.
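As a small, hypothetical example of that transform step (table and column names invented for illustration): shaping and labeling data once, upstream, spares every chart from repeating the same cleanup.

-- Rename cryptic columns and fix units before the dashboard sees them.
SELECT
    ord_dt AS order_date,             -- readable column names
    cust_seg AS customer_segment,
    rev_cents / 100.0 AS revenue_usd  -- convert cents to dollars
FROM staging.orders
WHERE is_test_row = FALSE;            -- drop rows viewers should never see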
Mistake #2: Ignoring Data Governance and Security Best Practices
Data security continues to make headlines, and yet many dashboard creators fail to implement proper governance or security standards. In our audits, alarming patterns emerged: teams failing to track who accessed sensitive datasets, overlooking controlled access through role-based permissions, or even neglecting the importance of data privacy altogether. Poor data governance may not impact visuals directly, but it severely impacts data trust and reliability, leading executives and teams to question the accuracy and integrity of all analytics efforts.
Why Governance and Security Should Be Top Priorities
Organizations in fintech and other regulated sectors, as carefully examined in our article on The Importance of Data Privacy in Fintech, bear critical compliance responsibilities. The security concerns inherent in poorly governed dashboards create significant compliance and business risks. Without proper governance, dashboards expose sensitive information, cause data breaches, and threaten brand reputation.
Implementing effective data governance practices—proper access controls, clear security protocols, and transparency regarding data origins and transformations—creates confidence in the accuracy and authenticity of the insights presented. Proactively embedding governance practices like role-based access ensures only authorized individuals see sensitive or critical data.
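As a minimal sketch of what role-based access can look like in SQL terms (the role, view, and column names here are hypothetical, and syntax varies by warehouse):

-- Create a restricted role and expose a curated view instead of the raw table.
CREATE ROLE analyst_restricted;

CREATE OR REPLACE VIEW analytics.customer_summary AS
SELECT customer_id, region, lifetime_value
FROM raw.customers;  -- PII columns are simply never selected

GRANT SELECT ON analytics.customer_summary TO analyst_restricted;

Analysts in the restricted role can build dashboards on the summary view while the raw table stays locked down.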
How to Fix It
Build governance directly into the dashboard design workflow from day one. Follow best practices laid out in our guide here: Examples Where ETL Supports Data Governance and Security. Explicitly define the roles that have access, placing security at the center of your dashboard strategy. Consider leveraging reliable analytics consulting services, like ours, which emphasize comprehensive data privacy and governance as a foundational element of all dashboard development.
Mistake #3: Overlooking the Importance of Data Context and Diversity
Raw metrics rarely explain themselves. Data dashboards must offer sufficient context or risk presenting misleading or ambiguous insights. Incorporating enough context across different datasets, industries, and perspectives prevents users from drawing false conclusions. Yet this critical step frequently gets neglected as companies rush to roll out dashboards. Without context, teams make poor operational decisions, directly affecting revenue, efficiency, and market position.
Why Data Context and Diversity Matter
Dashboards should always tie back clearly to strategic business goals, clearly interpreting data assumptions, collection methods, and limitations. Providing diverse data perspectives can paint a fuller, clearer picture, helping decision-makers understand what’s driving trends beyond initial assumptions. For example, our recent deep-dive article Unleashing the Power of Data Diversity highlights how diverse datasets inform stronger strategic outcomes.
Ignoring data diversity or context often stems from poor communication between distinct analytical roles. Do your teams clearly understand the roles involved in managing your dashboarding solutions? Check out our comparison in Data Engineer vs. Data Analyst for clarification on how distinct roles influence data usage strategies.
How to Fix It
Incorporate explicit context indicators, captions, or annotations in your dashboards. Highlight multiple viewpoints through diverse data layers to clarify misleading patterns. Establish an ongoing collaboration forum between data engineers, analysts, and business stakeholders, bringing these roles closer toward a unified understanding. Proactively offering context prevents misunderstandings and ensures truly effective dashboard-driven decisions.
Our Key Takeaway: Dashboards Are Strategic Investments, Not Just Displays
Your data dashboards aren’t mere visual extras—they embody strategic business assets that drive smarter decisions, innovation, and competitive advantage.
For example, harnessing the potential of dashboard analytics has transformed fintech companies profoundly, as explained in our analysis, The Power of Big Data in Fintech.
Addressing dashboard errors through thoughtful planning, robust governance, and adequate context transforms dashboards from flashy displays into powerful strategic tools.
If you’re grappling with ineffective dashboards or unsure where key weaknesses lie, professional analytics consultants can help confront these common pitfalls head-on. Consider investing in expert guidance from an advanced analytics consulting group like Dev3lop—our team specializes in refining dashboards aligned with industry-leading governance, context-driven analysis, and strategic objectives. Learn more about our strategic dashboard and analytics solutions today by exploring our Advanced Analytics Consulting Services.
Data Is Your Competitive Edge: Make Sure Your Dashboards Leverage It Properly
Transform your dashboards into real strategic engines by addressing functionality, data governance, security, and data context and diversity. Meaningful dashboards that empower strategic clarity are achievable, one clear insight at a time. Ready for an update?
Building dashboards in PowerBI quickly is important because decision makers are eager to start using these rocket ships we are creating. However, if you're new to PowerBI, that may be asking a lot! Tracing is helpful because it empowers us to quickly create a solution and design from scratch.
What is tracing? Drawing over the lines of an image on a superimposed piece of transparent paper; with Figma, you can do this digitally, tracing over any design to abstract your own.
Tracing dashboards is a good way to create something net new, and it offers a fast path for getting people talking about your dashboard designs.
In this article, you will learn to make powerful designs from scratch, which will empower you to create dashboards in PowerBI quickly. Here are plenty of screenshots showing what you're going to build, plus a template you can copy and paste into your next PowerBI design.
Create visual documentation for PowerBI Design
Here at DEV3LOPCOM, LLC, we passionately believe visual documentation improves project deadlines and allows for fast data product creation, and we want to show you how we would create a dashboard from scratch without any direction on the style or design.
Figma works, but any app that allows tracing over an image will do, and in this blog we will show you how to create this template.
We show the steps for tracing the design and adding it to PowerBI. This can help you operationalize your templates and improve your dashboarding speed across any dashboarding product.
About this PowerBI dashboard Data and our Goals
First, let's learn about the data and establish goals for our workload to keep us focused on an objective.
All data should have a brief description, otherwise it’s hard for others to collaborate with your data sources.
We will be using the following data about Social Media and Mental Health, released by the University of Maryland in July 2024.
Our goal is to quickly generate a dashboard to help others learn PowerBI. We have thin requirements, but it's fun to pretend this is a real-world software consulting engagement; as in many busy business environments, perhaps people are too busy to give us insights, so we must research and learn on our own.
About data: The dataset encompasses demographic, health, and mental health information of students from 48 different states in the USA, born between 1971 and 2003.
How do I get my CSV Data into PowerBI?
Open your PowerBI software. You don’t need to buy anything, just go grab the software and get started with me.
In the Home tab, click Get Data. Then select Text/CSV.
Once we have our CSV data open, you may notice we have weird column headers that aren't sensible to anyone on the dashboarding end.
This is typical: a lot of APIs, data warehouses, and data engineering outputs in general are rife with columns that aren't named well for each team. Luckily for us, PowerBI can change column names with great ease.
Finding Artwork to Trace Your Dashboard
First, we need to start by learning about "artwork." When you're learning to draw, an art teacher will ask you to trace something 100 times, and by the 100th time you'll be drawing it better.
The same goes for the internet: we are often reverse engineering each other's designs to improve our own. In this process we will find some artists we enjoy, choose one, and trace our dashboard over that design.
I like using Dribbble to find ideas and learn about modern approaches. It has a lot of really stylish content, and it’s easy to start here as a dashboarding guru.
I search for "modern dashboard style"…
If working with a client, I will find three designs and ask them to choose one. Then I'll build everything around this template. I like using Figma because it's easy enough for people to dig into the weeds and see that they can access the design elements.
Pretend our client suggests the following design.
Okay, paste the dashboard we are asked to mimic into Figma and let's start tracing.
You'll notice that as you do this, you'll start to create your own unique design for your dashboarding solution.
Start tracing design for PowerBI Dashboard
Cover the surface with a square.
Once the square hides the image completely, let's edit its transparency by hitting 5 on the keyboard, which should adjust the opacity.
Okay, keep flowing. Next, do the same thing for the side menu: trace it. But before we move on, adjust the edges to be rounded.
That's easy enough in Figma: grab the little white ball and pull it down until it hits the line we are tracing. Adjusting one corner adjusts all four.
Okay, hit the side menu.
Next, text overlays, and button overlays made with squares.
I prefer starting with a highlighted button so I know the sizing, then replicating that size across. Alt+drag an object to copy and paste it.
Working through the buttons should be easy; let the software guide you to make it perfect. Notice this design has a 7-pixel gap.
Skipping ahead…
Now that we have this style, let's see what it looks like in PowerBI.
Adding your Figma design to PowerBI is simple. It’s a file.
Export the file to your computer.
Add image to PowerBI.
Resize dashboard so it fits cleanly.
Remove the padding; this is my least favorite thing to have to do in Tableau and PowerBI. These apps automatically pad everything for some reason, haha.
Making decisions about our new Figma Style for PowerBI
In the beginning stages it’s about speed and repeatability. In more advanced dashboard development Figma saves a lot of time.
Next, let's duplicate our work area and move the sub-button navigation for "today" to the right side.
This is good enough for PowerBI. But before we leave, let's dive into how we can improve the color palette. I'm using Coolors for an easy one.
Now, start to style your dashboard so that it's appealing. Don't spend too much time here, because chances are the design will change; you're just trying to make it look decent. Use corporate colors so you're following the "designers" pattern. They can send you a PDF file with the correct style guide, which improves this process, but today we are tracing and coming up with our own style guide.
As you’re applying color, start to focus on subtle details…
Improving PowerBI Dashboard with Logos, Style, and Strategy
Logos make dashboards pop. You can easily grab one, so grab the most recent logo. Don't edit people's logos; use what they supply online.
I'm choosing the source data's logo to represent the source information, because putting my logo here would not be a factual representation of the effort.
Now, notice what happens when I size it next to the buttons. Depending on your screenshot and the size of the dashboard being traced, it's subtle in Figma: my sizing is subtly off and I can't seem to make it exact, so I generate black guide bars and aim to sync everything up for "perfect." People will use your dashboard more often if it's synced up.
In this example/screenshot I'm demonstrating how lining up this logo is a little more tedious than letting Figma define things by snapping edges, so I created black guide lines to help me follow a consistent flow from top to bottom. This is the kind of "hawk eye" or "pixel perfect" strategy I need you to deploy to create powerful dashboards in any reporting software or designed front-end!
Before we part, a few more subtle wins to consider as you perfect your traced design for PowerBI.
This will give a very nice clean view. In Figma, click the red square and paste the image; it's a very easy process if you created the space for the icon. As you make this selection of icons, realize nothing is perfect; we are prototyping, so get something in there, because fast and repetitive is the key!
Notice how we made some decisions that moved us away from the original design; this is called "making it your own."
One more layer of decisions to clean it up.
The strategy here is making things clean and lined up, using LINES to guide ourselves. Delete these guide lines once you’ve mastered this technique and keep duplicating to avoid having to do this process again…
Here's my workstation. Notice I'm starting to document what goes inside the buttons, and the documentation is on the same screen. This helps with identifying where our hard work belongs.
The header looks a little close to the first square, but it's a good starting point, and we can optimize that later. The point of using guides/lines is the important part of this training.
Choosing cool icons for PowerBI Navigation
Since we are prototyping and not rushing to production, we just need a simple PNG file for icons. A Google search for "black icon mental health heart" will bring up a lot of options you can trace.
Simply click a square in Figma and Ctrl+V paste.
This is why we created that square in the tracing section: it outlines my sizing requirements.
Now we have two buttons and a logo. Things are cooking. Plus, custom icons. Always tell people it's easy to change icons; this is just a prototype.
Many tracing apps can be found in the Figma community, and they are great for icon tracing. This creates a vector trace of the heart/brain icon.
Once you trace the SVG, you can color the file, and it's a vector rendering. I like changing the color to match the palette.
Now, to finalize the visual. I use more guides but in the shape of a square this time. Find what works best for you.
Insert image into PowerBI
Woot, you're here! You're doing your own design based on a tracing.
I hope you’re proud of your drawing. If not, simply grab more ideas and trace until you’re satisfied.
Back in Figma, group your design and export it. Then, in PowerBI, open the Insert tab, click Image, and navigate to the image you created in Figma.
Start to play with dashboard sizing based on your image size.
Adding your first charts on new style in PowerBI
Okay, so you’re adding your new traced design to PowerBI as an image. You fixed the canvas.
And you’re beginning to add charts.
I've started with the easier charts that feel very global, like the number of states accounted for in the overall survey and the differences between genders. The general health column also popped to mind, considering one of our buttons says General Health too; even though it's a placeholder, perhaps we can go into detail about general health behind that button. I also like making actionable KPIs flow with the buttons, so end users know that if they click that bar chart they will learn more about general health, and that the General Health button will take them there too.
Scaling up on your new Traced PowerBI Design Template
Okay, people are going to ask you to change your PowerBI design; for example, the pixels aren't perfect, or maybe a 2-pixel border around charts isn't great.
This is why I love having my dashboard design in Figma. Easy to edit. Copy and paste and start new styles.
In PowerBI the process is similar: right-click the dashboard tab and click Duplicate to duplicate your dashboard.
Now, delete the background image and add the new image. It should look like this if you're still adding charts. As long as you don't move boxes, you're safe to simply add back the new image and it will fit perfectly.
This is a good sign, you’re not depending on a reporting platform to manage your design elements. You can slap this background into any reporting software.
Now you have a duplicate tab in PowerBI. I went with neumorphism, a cool design technique that makes elements feel like they're popping off the screen because of the light and dark shadows. Do you notice the differences in the shadows?
Conclusion to Tracing Designs with Figma for PowerBI Desktop
While working with designers, we are often given screenshots of artwork, and tracing allows us to gain what we need to be successful.
I hope you enjoyed this tutorial on creating quick PowerBI products using Figma to trace.
Keep perfecting your craft and let us know if you need help with any dashboard designing services!
We will add more training like this in our articles here on dev3lop, stay tuned.
Although we started as a Tableau consulting company, we have been navigating into more and more PowerBI over the past few years.
To excel at quick calculations, it's essential to identify and address slow ones, ensuring that even the next data expert won't find your workbook too daunting.
#1 Problem with Slow Tableau Workbooks
Solving slow Tableau workbooks is often a calculation optimization game.
Then comes the migration of transformations. Boolean-style calculations, for example, are easily pushed to SQL, because SQL handles Boolean logic with ease, so why make Tableau do this for you? This is a subtle win, and as you continue you'll find bigger wins in the blog below.
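For instance, a flag Tableau would otherwise evaluate row by row can be computed once in the source query. This is only a generic sketch; the table and column names (orders, sales, is_large_order) are hypothetical, and the exact Boolean syntax varies by database:

-- Compute the Boolean in SQL so Tableau receives a ready-made flag.
SELECT
    order_id,
    order_date,
    sales,
    (sales > 500) AS is_large_order  -- some dialects need CASE WHEN sales > 500 THEN 1 ELSE 0 END
FROM orders;

Tableau can then filter or color by is_large_order directly instead of re-evaluating an IF statement per row.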
Think of Tableau as a tool you don't need to overcomplicate. You can prototype, transform, and build a data product, and then worry about the "improvements" we discuss below in the near future.
Stressing these tiny details now will slow down your project and stress out business users. Do it when no one is looking, or when someone asks "why is this slow?"
During Tableau consulting engagements, we find it's easy to move slow calculations into your database after the prototyping phase, and to push heavily used calculations to your SQL layer during the hardening phase at the end. Anything that still changes often is best kept in your Tableau workbook until everyone has completed their apples-to-apples comparisons.
Optimizing Calculations in Tableau Desktop for Better Performance
When it comes to Tableau Desktop, writing fast and efficient calculations isn’t just a nice-to-have—it’s a must for performance and scalability. A calculation that works is great, but one that works fast is better, especially as data grows. Let’s break down why certain choices in your calculations can have a massive impact on performance, focusing on the example provided.
The Problem: Slow String-Based Calculations
Here’s the first example:
if month(date) >= 5 then "blue"
else "orange"
end
Why is this slow? Strings.
Strings Are Heavy: Every time Tableau processes this, it’s comparing strings instead of lighter, numeric values. Strings take up more space and are slower to process than integers.
The else Isn’t Necessary: If your logic doesn’t need an else, don’t add one just to fill in. else assigns a default value—if that value isn’t relevant, you’re doing extra work.
The Improvement: Simplifying the Logic
Here’s a slightly improved version:
if month(date) >= 5 then "blue"
end
This avoids unnecessary processing by dropping the else. If the condition isn’t met, Tableau will simply return NULL. However, this still relies on strings, which slows things down.
The Better Option: Switch to Numbers
if month(date) >= 5 then 1 // blue
elseif month(date) <= 4 then 2 // orange
else 0 // filter out
end
This is a solid step forward. Why?
Databases Love Numbers: Integer-based logic is much faster because databases and Tableau’s data engine process integers far more efficiently than strings.
Strings can take thousands of possible values and must be compared character by character.
Integers are compact, fixed-width values, making comparisons simpler and faster.
Future-Proof Logic: By using integers, you’re not just optimizing today; you’re setting yourself (and your team) up for easier scaling and maintenance tomorrow. Want to add another category? It’s just another number.
Ease of Filtering: Returning 0 for filtering out irrelevant data reduces additional logic elsewhere, streamlining workflows.
Why Does This Matter?
When you write calculations that rely on strings, Tableau (and the underlying database) has to:
Compare values character by character.
Manage much larger datasets because strings require more storage.
Perform extra lookups if you’re working with case-sensitive text.
Switching to numeric logic tells Tableau to focus on lightweight, easy-to-process values. Over time, this can lead to noticeable performance improvements, especially with large datasets or frequent dashboard updates.
Pro Tip: Comment for Clarity
This isn’t just about optimizing calculations; it’s about teaching better practices. Add comments like this:
if month(date) >= 5 then 1 // blue
elseif month(date) <= 4 then 2 // orange
else 0 // filter out irrelevant months
end
By documenting your choices:
You make your logic easier for others to understand.
You reduce the need for future troubleshooting.
You create a shared knowledge base, improving team productivity.
The Bottom Line: Calcs need to be faster!
When building calculations in Tableau, think beyond “does this work?” to “how efficiently does this work?” Opt for integer-based logic over strings whenever possible. It’s a small change with a big payoff, especially as your dashboards grow more complex. Less work for Tableau = faster insights for you.
Got other optimization tips? Let me know in the comments!
A Faster Tableau Calculation
The simplest and fastest approach? Stick with numbers and Booleans:
if month(date) >= 5 then 1 // blue
else 0 // orange
end
Why It Works: You’re just typing numbers. The comments explain the logic for human readers without bogging down Tableau with unnecessary strings.
Scalable: This approach is ideal for larger datasets and complex workbooks. As your project grows, you’ll appreciate the simplicity and speed of integer-based logic.
For an even lighter touch:
month(date) >= 5
Boolean Flag: This returns TRUE or FALSE directly, which is incredibly efficient. Boolean logic is the leanest and fastest calculation type Tableau can process.
Why Writing Fast Tableau Calculations Matters
Writing fast calculations isn’t just a power move for your own dashboards—it’s a cornerstone for building a thriving Tableau community. Here’s why it matters:
User Adoption: Fast calculations mean responsive dashboards. That translates to better user experiences and higher adoption rates for your work.
Community Growth: When you optimize your calculations, you share best practices that help others master Tableau’s native features.
Future-Proofing: Hundreds of slow calculations will drag your workbook down over time. Optimized logic ensures your dashboards remain scalable and maintainable.
Let’s keep the momentum going: Write fast Tableau calculations, build amazing dashboards, and grow the community together. Pretty dang fast, right? 🚀
Today, we would like to highlight the functionality of date buckets, which is how we like to think of it mentally; others call it period-over-period analysis in Tableau Desktop. Both periods are buckets of dates, work great with min(1) KPI dashboards, and are often used in our Tableau Consulting engagements.
This blog delves into a method for date calculations used as trailing periods of time, giving you quick comparisons between two periods in Tableau. In other words, we focus on identifying the last two periods in your data source, and the end user supplies a value that grows those buckets based on a date part you pick.
This approach enhances the efficiency and clarity of your analytical processes in Tableau and is easy to reuse. There are many ways to write this calculation; this is one of them.
between dates filter
In Tableau, a between-dates filter creates two calendar inputs, and most executives don't want to click anything.
It only takes three steps to build self-generating, automated (not static set filters) date buckets in Tableau Desktop that trail your max date in the date column [W].
lol, type this stuff or paste the code from this tutorial.
Below please find my quick win tutorial as a means of quickly winning… on any Tableau workbook with a date and a parameter.
We will be using the SuperStore subset of data, which comes with every license of Tableau Desktop. In your data, you probably have a date; use that date and follow along with these next two steps.
To begin, you need a date and a parameter.
Step 1: make a date variable named W.
Create a new calculated field in Tableau Desktop and call it W.
Make a simple variable W in place of your date; your date goes in this calculated field.
Now make the parameter.
Step 2: make a parameter variable named X. It's an integer.
This will be the number of 'X' units per period of analysis.
Make a simple variable X in place of your parameter.
Paste the calculation below in any workbook with a Date and Parameter.
Above, if you followed along, you will not need to make any major changes to the calculation.
if DATETRUNC('month', [W]) > DATEADD('month', -([X] + DATEDIFF('month', {MAX([W])}, TODAY())), TODAY())
then "Current Period" //make this 0
elseif DATETRUNC('month', [W]) > DATEADD('month', -([X]*2 + DATEDIFF('month', {MAX([W])}, TODAY())), TODAY())
then "Previous Period" //make this a 1
else "Filter" //make this a 2
END
//[W] = date
//[X] = parameter
Drag and drop this onto the view, right-click it, choose Filter, and filter out the "Filter" bucket.
Now, only two buckets of time are available. You’re welcome!
Automated period over period analysis in Tableau
You’ve just implemented automated date buckets in Tableau, allowing end-users to control visualizations using the bucket generator. Personally, I find the tool most effective when using it in a daily context rather than a monthly one. However, the monthly option provides a convenient way to encapsulate dates within distinct periods, while the daily granularity offers a simpler and more immediate view.
Having a rapid date divider or bucket automation at your disposal is highly advantageous. It empowers you to visually highlight disparities between two date periods or employ the calculations for logical flagging, subtracting values, and determining differences, all without relying on the software to construct these operations through window calculations.
Optimizing date buckets or period-over-period analysis in Tableau
Optimization #1: remove LOD calculations
Nothing against LOD calcs, except they are slow and built to help users who don’t know SQL.
{MAX([W])} seeks to find the max date; you can get it more cheaply as a column in your select statement. If you don't know what that means, ask the data architect supporting your environment to add max(date) as a column, repeated on every row. They will know what to do, or you need a new data architect.
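If you can edit the source query yourself, a window function is one common way to repeat the max date on every row. This is a hedged sketch with hypothetical table and column names (sales, order_date), not your schema:

-- Repeat the overall max date on every row, so Tableau reads it
-- as a plain column instead of computing {MAX([W])} itself.
SELECT
    t.*,
    MAX(order_date) OVER () AS max_order_date
FROM sales AS t;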
Optimization #2: stop using % difference or difference table calculations
Nothing against table calculations, except they are slow and built to help users who don’t know SQL.
Optimization #3: change strings to integers.
Nothing against strings, except they are slow.
It’s likely not your fault that you’re using strings in 2018 with if statements, it’s probably because someone taught you who also did not know how to write optimized Tableau calculations.
Optimization #4: ‘month’ date part… add a swapper.
DATETRUNC is used to round dates to the nearest relative date part; that's just how I explain it easily.
Date part can be a parameter.
DATEPART(date_part, date, [start_of_week])
No, I don't mean the DATEPART function.
DATETRUNC(date_part, date, [start_of_week])
Yes, I mean the date_part argument, which is scattered through the calculation and easy enough to replace with a parameter full of date parts. Now the end user can play a bit more.
Optimization #5: remove max(date), add an end date parameter…
Remove {MAX([W])}, or the max(date) subquery explained above, because you can give your end user the opportunity to change the end date using a parameter.