# How to count distinct based on the sum of another column?

I have a dataset with a list of "pulls" for each day and the "setter" associated with each pull. Through the ETL for this dataset, I have a calculation that sums the total sales per setter for the previous 30 days. My objective is to count the number of distinct setters whose sum of total sales is 0 for the time range filtered on the card. How would I do this in a beast mode? I can't do this through the ETL, as I need the count to update dynamically based on the date range the user chooses for the card.

Ex:

If I were to look at the distinct number of setters for 9/30/22 the number would be 1 because Test User 1's sum of total sales in the last 30 days exceeds 0 and Test User 2's total is 0, so Test User 1 does not count in the setter count but Test User 2 does.

If I were to look at the distinct number of setters for 9/25/22 - 9/30/22, the total would be 0, as both test users' sums of total sales would exceed 0 for that time frame.

Is this calculation possible through a beastmode?
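The logic being asked for can be sketched outside of Domo to make the target behavior concrete. This is a minimal Python sketch with hypothetical sample rows shaped like the example above (dates, setters, and sales values are made up to match it); it is not the actual dataset or a beast mode:

```python
from datetime import date

# Hypothetical sample rows: (pull_date, setter, total_sales)
rows = [
    (date(2022, 9, 26), "Test User 2", 4),
    (date(2022, 9, 27), "Test User 1", 3),
    (date(2022, 9, 30), "Test User 1", 1),
    (date(2022, 9, 30), "Test User 2", 0),
]

def zero_sale_setters(rows, start, end):
    """Count distinct setters whose sales sum to 0 within [start, end]."""
    sums = {}
    for pull_date, setter, sales in rows:
        if start <= pull_date <= end:
            sums[setter] = sums.get(setter, 0) + sales
    return sum(1 for total in sums.values() if total == 0)

# Single day 9/30: Test User 1 sums to 1, Test User 2 sums to 0 -> 1
print(zero_sale_setters(rows, date(2022, 9, 30), date(2022, 9, 30)))  # 1
# Range 9/25-9/30: both setters sum above 0 -> 0
print(zero_sale_setters(rows, date(2022, 9, 25), date(2022, 9, 30)))  # 0
```

The key point is that the per-setter sum must be recomputed over whatever date range the card filter selects, which is why a static ETL column can't answer it on its own.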

• Contributor Hi @lb1234,

Assumption - The 3 columns "Pull Date" (A), "Setter"(B) and "Total Sales Last 30 Days"(C) are coming from the ETL.

What I understand from the example is: if the sum of "Total Sales Last 30 Days" for a setter is greater than 0, that setter should not be counted. Conversely, if a setter had no sales in the past X days, that setter counts as 1?

So you could write it as,

COUNT(DISTINCT (CASE WHEN SUM(`Total Sales Last 30 Days`) = 0 THEN `Setter` END)) OVER (PARTITION BY `Setter`)

This should work. Attaching test data.

Let me know if this does not work for you!

Thanks

'Happy to Help'
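As a sanity check of the counting logic in this answer (not of beast mode syntax itself, which handles window functions differently), the same idea can be expressed in plain SQL: sum sales per setter over the filtered range, then count setters whose sum is 0. The table name `pulls` and the sample values below are hypothetical, chosen to match the example in the question:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE pulls (pull_date TEXT, setter TEXT, total_sales INTEGER)")
con.executemany(
    "INSERT INTO pulls VALUES (?, ?, ?)",
    [
        ("2022-09-26", "Test User 2", 4),
        ("2022-09-27", "Test User 1", 3),
        ("2022-09-30", "Test User 1", 1),
        ("2022-09-30", "Test User 2", 0),
    ],
)

def zero_sale_setters(con, start, end):
    # Sum per setter within the range, then count setters whose sum is 0.
    (n,) = con.execute(
        """
        SELECT COUNT(*) FROM (
            SELECT setter, SUM(total_sales) AS s
            FROM pulls
            WHERE pull_date BETWEEN ? AND ?
            GROUP BY setter
        ) WHERE s = 0
        """,
        (start, end),
    ).fetchone()
    return n

print(zero_sale_setters(con, "2022-09-30", "2022-09-30"))  # 1
print(zero_sale_setters(con, "2022-09-25", "2022-09-30"))  # 0
```

The beast mode version folds the per-setter grouping into the `OVER (PARTITION BY `Setter`)` clause instead of a subquery, but the result should agree.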

• Thank you so much!!!!

• @Aditya_Jain That worked when looking at a single value card but when I'm looking at the count for the last 13 weeks in a bar graph, I'm running into issues as it's not grouping the days into weeks. Do you know why this would be?

• @Aditya_Jain when I try to get an average of the number of tasks by setter using that calculation, I'm getting a divide-by-zero error. How do I fix this? Here is my calculation:

COUNT(DISTINCT `Task.ActivityID`) / (COUNT(DISTINCT (CASE WHEN SUM(`Is Setter`) = 0 THEN `Setter.FullName` END)) OVER (PARTITION BY `Setter.FullName`))

• Contributor On this calculation, without going into the correctness of the formula, here is how to resolve the divide by zero:

COUNT(DISTINCT `Task.ActivityID`) / (COUNT(DISTINCT (CASE WHEN SUM(`Is Setter`) = 0 THEN `Setter.FullName` END)) OVER (PARTITION BY `Setter.FullName`))

I assume this is part of a case statement,

To avoid the divide by zero, wrap the whole thing in a CASE: check whether your denominator, which I suppose is ' (COUNT(DISTINCT (CASE WHEN SUM(`Is Setter`) = 0 THEN `Setter.FullName` END)) OVER (PARTITION BY `Setter.FullName`)) ', is 0, and if it is, make the entire calculation return 0.
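The guard described above can be sketched in plain SQL (run through sqlite here just to make it executable; the numerator and denominator values are hypothetical). A `CASE` returns a chosen fallback, while `NULLIF` is an alternative that turns division by zero into NULL, if your context can tolerate a null result:

```python
import sqlite3

con = sqlite3.connect(":memory:")
num, den = 7, 0  # hypothetical numerator / denominator values

# CASE guard: return 0 when the denominator is 0, otherwise divide.
(case_result,) = con.execute(
    "SELECT CASE WHEN ? = 0 THEN 0 ELSE ? * 1.0 / ? END", (den, num, den)
).fetchone()
print(case_result)  # 0

# NULLIF alternative: NULLIF(den, 0) is NULL when den is 0,
# and dividing by NULL yields NULL instead of an error.
(nullif_result,) = con.execute(
    "SELECT ? * 1.0 / NULLIF(?, 0)", (num, den)
).fetchone()
print(nullif_result)  # None
```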

Please let me know if this works.

Thanks

'Happy to Help'
• @Aditya_Jain it still did not work. I've wrapped it in a case statement, please see below:

CASE WHEN (COUNT(DISTINCT (CASE WHEN SUM(`Is Setter`) = 0 THEN `Setter.FullName` END)) OVER (PARTITION BY `Setter.FullName`)) = 0 THEN 0 ELSE COUNT(DISTINCT `Task.ActivityID`) / COUNT(DISTINCT (CASE WHEN SUM(`Is Setter`) = 0 THEN `Setter.FullName` END)) OVER (PARTITION BY `Setter.FullName`) END

• Contributor Hi, can you please provide a sample dataset with the mentioned columns so that I can test it and let you know a solution?

'Happy to Help'