Beastmode: Previous 7 Day Avg Comparison
I am working in a Mega Table, transposing beast modes to get daily KPIs. I would like to compare each daily KPI to the average of the 7 days preceding it: (current value - 7 day avg) / 7 day avg.
I'd like this beast mode to appear in a column next to the daily value for each day. I'm just not sure how to get this to come through, or whether it's possible outside of an ETL. We attempted the ETL route, but because of the filtering capability we'd like to use on the dashboard, an ETL will not work for us: the formulas need to stay dynamic with the filterable fields.
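In plain terms, each day's figure is compared against the mean of the seven preceding daily values. A minimal Python sketch of that calculation (function name and sample numbers are illustrative, not Domo syntax):

```python
from statistics import mean

def pct_vs_prior_avg(current, prior_window):
    """(current value - avg of preceding window) / avg of preceding window."""
    avg = mean(prior_window)
    return (current - avg) / avg

# Example: today's KPI is 120 and the preceding 7 daily values average 100,
# so the comparison is (120 - 100) / 100 = 0.2, i.e. +20%.
print(pct_vs_prior_avg(120, [90, 95, 100, 105, 100, 110, 100]))  # 0.2
```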
Any suggestions?
Best Answer
If you want this to work the way you mentioned, you'll need to replicate all entries for the last 7 days plus today on each day you want to graph; that way each entry carries all of its columns and is affected by the filters you apply. I was working on a mock-up of what this would look like, but making the number of days to average controlled by a variable adds a whole lot more complexity, unless you define a maximum number of days, filter to keep that much data, and then make sure you only include the rows that match your variable criteria. Quick question: how should the average handle dates with no entries, should those be treated as 0 or ignored?
For my test I used a dataset that looks like:
This goes into an ETL that replicates entries up to 7 days back (here you'd change 7 to the maximum number of days you want to look back when doing the average).
Which yields the following result:
Which can be graphed in a pivot table by GraphDate:
This way, if you decide to break it down by different lenses or apply different filters, the average and the comparison will update dynamically based on what you have, since each entry has its full information replicated. Keep in mind we're not filling blank dates in this approach (for that, you'd need to change the ETL to use the DOMO - Calendar table instead of the list of unique dates that appear in your current data). For it to be responsive to an X-day variable for the average, you'd update the formula I currently used:
SUM(CASE WHEN `Relation Flag` = 'Previous' THEN `spend` END) / COUNT(DISTINCT CASE WHEN `Relation Flag` = 'Previous' THEN `date_` END)
to something along the lines of:
SUM(CASE WHEN `Relation Flag` = 'Previous' AND `date_` + INTERVAL `VariableX` DAY >= `GraphDate` THEN `spend` END) / `VariableX`
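To sanity-check the replication idea, here is a plain-Python mock of it (the column names spend, date_, GraphDate, and Relation Flag come from the formulas above; the function names and sample data are made up, and the average here ignores dates with no entries by counting distinct prior dates):

```python
import datetime as dt

def replicate_for_lookback(rows, lookback=7):
    """Copy every row onto each GraphDate it can serve: its own date
    (flagged 'Current') and the next `lookback` graph dates ('Previous')."""
    graph_dates = sorted({r["date_"] for r in rows})
    out = []
    for gd in graph_dates:
        for r in rows:
            delta = (gd - r["date_"]).days
            if delta == 0:
                out.append({**r, "GraphDate": gd, "Relation Flag": "Current"})
            elif 1 <= delta <= lookback:
                out.append({**r, "GraphDate": gd, "Relation Flag": "Previous"})
    return out

def comparison(replicated, gd):
    """(current - prior avg) / prior avg for one GraphDate, averaging over
    distinct prior dates so dates with no entries are ignored."""
    sub = [r for r in replicated if r["GraphDate"] == gd]
    cur = sum(r["spend"] for r in sub if r["Relation Flag"] == "Current")
    prev = [r for r in sub if r["Relation Flag"] == "Previous"]
    avg = sum(r["spend"] for r in prev) / len({r["date_"] for r in prev})
    return (cur - avg) / avg

# Spend is 100/day for 7/4-7/10, then 120 on 7/11.
rows = [{"date_": dt.date(2023, 7, d), "spend": 100} for d in range(4, 11)]
rows.append({"date_": dt.date(2023, 7, 11), "spend": 120})
print(comparison(replicate_for_lookback(rows), dt.date(2023, 7, 11)))  # 0.2
```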
Just an idea
Answers
Hi @lbg123
When you say previous 7 days, is that a rolling 7-day average you're looking for, or are you looking for the average of the 7 days leading up to today? For instance, today is July 11, so do you mean July 4 - July 11? And if it's July 1, do you not need an average then, or do you want the average from June 23 - June 30?
John Le
Let me make you the me of your organization for Domo
@DashboardDude 7 days previous. So for today (7/11/23), I'd want the average for 7/4/23 - 7/10/23
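So the window for each day excludes the day itself. A quick date sketch of that window (illustrative Python, not Domo syntax):

```python
import datetime as dt

def prior_week_window(day):
    """The 7 calendar days immediately preceding `day`, excluding `day`."""
    return [day - dt.timedelta(days=k) for k in range(7, 0, -1)]

window = prior_week_window(dt.date(2023, 7, 11))
print(window[0], window[-1])  # 2023-07-04 2023-07-10
```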
@lbg123 Got it.
Hope this helps:
Love @DashboardDude's solution, nice and clean! @lbg123, when you say 7-day average, are you referring to the last 7 days from today, or to the 7 days prior to each entry? This makes a huge difference in how the ETL needs to be built, as does whether the average of the prior 7 days should respond to the card filters.
@DashboardDude I appreciate the time and effort you put in here to help with my issue. Unfortunately, I need the 7-day average to cover the 7 days prior to each date, not just the last 7 days. I was able to grab this information using the Lag function in the Rank and Window tile in an ETL, which works great for my group-bys that are sums of a column. The issue I run into is that some of the beast mode formulas, when used in the Group By tile in an ETL, are only effective when looking at exactly the data I'm grouping by: Sales Channel, Deal, and Date.
The workaround I came to was duplicating the dataset in the ETL and creating Group By tiles with a specific granularity for specific cards: a summary card with all sales channels and deals (group by Date only), a summary card partitioned by Sales Channel (group by Date and Sales Channel), and a summary card partitioned by Sales Channel & Deal (group by Date, Sales Channel, and Deal).
I then created a constant called "Granularity" that each card can filter on to select the granularity at which I wish to view the summary numbers, and then I appended these outputs together:
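That duplicate-and-append step can be sketched in plain Python (the columns Date, Sales Channel, and Deal come from the post; the Sales measure, the level labels, and the function name are illustrative stand-ins for the Group By tiles):

```python
from collections import defaultdict

def build_summary(rows):
    """Group the same rows at three granularities, tag each result with a
    'Granularity' constant, and append them into one output."""
    levels = {
        "Date": ("Date",),
        "Date+Channel": ("Date", "Sales Channel"),
        "Date+Channel+Deal": ("Date", "Sales Channel", "Deal"),
    }
    out = []
    for label, keys in levels.items():
        totals = defaultdict(float)
        for r in rows:
            totals[tuple(r[k] for k in keys)] += r["Sales"]
        for key, total in totals.items():
            rec = dict(zip(keys, key))
            rec["Sales"] = total
            rec["Granularity"] = label
            out.append(rec)
    return out
```

Each card then filters on the Granularity constant to pick the one slice it should read.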
The average calculation is then done in a beast mode on the card, since the number of days I'm averaging is driven by a variable adjusted on the dashboard:
While I wish that I could do this on a Mega Table with the Transpose feature rather than having to create this whole dataflow to achieve the results I need, I just can't figure out how that would be possible.
This is my final result using the ETL: