SUM DISTINCT Not Working Correctly in the Total
Best Answer

The SUM DISTINCT function is interesting in that it only sums the distinct values in the column you are referring to in your Beast Mode calculation.
For instance, suppose you have two projected load values for 20240501 (10, 20) and three projected load values for 20240601 (30, 10, 10). The SUM DISTINCT output for 20240501 will be 30, since both values under that date are unique. The SUM DISTINCT output for 20240601 will be 40, since the duplicate is skipped (10 appears under that date twice). So you get 30 for 20240501 and 40 for 20240601.
Where it gets interesting is that the Total row will compute 60 in this example, rather than the 70 you might expect. This happens because the Total row itself sums the distinct values across all rows and skips the duplicates: it only considers 10 + 20 + 30 = 60, since the value 10 appears three times across the two dates.
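To make the behavior concrete, here is a small Python sketch that mimics what SUM DISTINCT does per date versus what the Total row does across all rows (the dates and values come from the example above; the helper and variable names are just illustrative):

```python
# Hypothetical recreation of the SUM DISTINCT behavior described above.
# Dates and values mirror the example; names are illustrative only.
rows = [
    ("20240501", 10), ("20240501", 20),
    ("20240601", 30), ("20240601", 10), ("20240601", 10),
]

def sum_distinct(values):
    """Sum each distinct value exactly once, like SUM(DISTINCT ...)."""
    return sum(set(values))

# Per-date subtotals: duplicates are skipped within each date.
by_date = {}
for date, value in rows:
    by_date.setdefault(date, []).append(value)
subtotals = {date: sum_distinct(vals) for date, vals in by_date.items()}
print(subtotals)  # {'20240501': 30, '20240601': 40}

# The Total row dedupes across ALL rows, so 10 only counts once overall.
grand_total = sum_distinct(v for _, v in rows)
print(grand_total)  # 60, not the 70 you get by adding the subtotals
```

The key point is that the Total row is not summing the subtotals; it reapplies the distinct sum over the whole column.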
With that being said, my recommendation is to create this calculation in the dataflow itself rather than in a Beast Mode calculation since you have more functionality there to work with and can avoid these confusing outputs.
Answers

Is "Projected Loads" itself a sum? The grand total might be deduping the values that give you the monthly totals.
Is there a reason to do a distinct sum rather than a straight sum? There are probably non-distinct values making up your monthly totals.
Let's say July was: 100, 100, 100 for a total of 300; August was 50, 50, 50 for a total of 150, so your totals table would look like:
July: 300
August: 150
Your overall distinct sum would be 150, not 450.
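A quick Python check of that arithmetic (the month names and values are just the ones from the example above):

```python
# Quick check of the July/August arithmetic described above.
july = [100, 100, 100]
august = [50, 50, 50]

# A straight sum gives the monthly totals you see in the table.
monthly_totals = {"July": sum(july), "August": sum(august)}
print(monthly_totals)  # {'July': 300, 'August': 150}

# A distinct sum over all rows keeps each value once: {100, 50} -> 150.
overall_distinct = sum(set(july + august))
print(overall_distinct)  # 150, not the 450 a straight sum would give
```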
@ColinHaze Have you tried using a FIXED function instead of a SUM Distinct?
I ended up taking Jonathan's advice and just running it through the dataflow.
Thanks all!