Need some guidance on counting records in a dataset and then counting a subset

I have a dataset made up of a number of records, or sales orders. There is a Completed Date, and this makes up the first count, which is easy: a count of the SalesOrder Completed Date. The issue is the second, subset count: there is also a Survey Date, and I want to count the number of surveys taken based on that Survey Date.

This dataset is a blend of two datasets, the SalesOrder and the Survey data, with a left join on the SalesOrder based on OrderID.
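
Roughly speaking, the blend behaves like this SQL (the table and column names here are my approximations, not necessarily the real ones):

    SELECT o.OrderID, o.CompletedDate, s.SurveyDate
    FROM SalesOrder o
    LEFT JOIN Survey s
        ON s.OrderID = o.OrderID

so every order comes through, and the survey columns are blank where no survey exists.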

When I do the count of both I always get the same value for each; it is counting all the Order Completed dates that have a Survey Date that falls within the date range.

I want to be able to get the count of all Completed Orders and then the count of all Surveys taken on those closed Orders.

This seems very straightforward, but I can't come up with the two different sums; it always comes up with the same number for both. I've tried numerous Beast Modes, including a Closed Flag on the SalesOrder where the Closed Date is not null and a survey count where the Survey Date is not null, and tried doing a sum of both, but I still get the same number. (Sketches of what I tried are below.)
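
The Beast Modes I tried look something like this (field names approximated):

    Closed Flag:
    SUM(CASE WHEN `Closed Date` IS NOT NULL THEN 1 ELSE 0 END)

    Survey Count:
    SUM(CASE WHEN `Survey Date` IS NOT NULL THEN 1 ELSE 0 END)

Both come back with the same number on the card.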

Let me know if there is a different approach I should be taking or something I'm overlooking.

Thanks in advance for any help on this.

Randy

Comments

  • Hi Randy,

    As far as I understand, there may be:

    a) an issue with the join (a left join keeps all rows in the left dataset and adds only the matching rows from the right). This may not be the case, but just make sure the join doesn't drop the orders that do not have a survey related to them.

    b) given what you say here:

    "When I do the count of both I always get the same value for each; it is counting all the Order Completed dates that have a Survey Date that falls within the date range."

    I suspect that if the issue is not the join, your date field (for graphing) is the Survey Date. That would mean Domo only uses that field as the date, and counting the corresponding order dates would put you in the same situation as in a): all orders that do not have a survey date attached to them are not shown in the graph. (The situation differs slightly in where it affects your graphing: in a) they are not in the data, in b) they are ignored.)

    c) or the fact that the numbers are equal is sheer coincidence and the data is all fine. If you are using a Beast Mode, are you using COUNT() or COUNT(DISTINCT)? (See the sketch just below.)
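
    To illustrate the difference (the field name here is just a placeholder):

        COUNT(`Survey Date`)
        COUNT(DISTINCT `Survey Date`)

    The first counts every row where the Survey Date is not blank/null, duplicates included; the second counts each unique survey date only once. Since COUNT() skips nulls, on a left join like yours the survey count should normally come back lower than the order count unless every order has a survey.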

     

    If it is none of these three, I think we will need more info or some dummy data to figure this out.

    You might also want to take a look at this Dojo post, which dealt with a similar issue based on two date columns, to see if it gives you extra insight.

    Let me know what the issue was when you figure it out!

    HTH

  • I've taken a look at the dataset and it is returning the expected data. I have a list of RepairOrders, and if you go far to the right, to the Survey Date column and the RepairOrder1 column, not all of them are populated; many are blank, representing that no survey has been done. So doing a count on any of these should work.

    I also thought the Survey Date might be the issue, and early on it probably was, but when I switched it to Delivered Date it usually fails to return anything.

    I'm trying to display these results in a table view, so any of them should work.

    I'm using a COUNT in the Beast Mode, but I've also simply tried using the count aggregation on the field itself, with no luck. (A sketch of the Beast Mode is below.)
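
    For reference, the Beast Mode version is essentially just a plain count on the survey field, something like this (field name approximate):

        COUNT(`Survey Date`)

    Since COUNT() skips nulls, I'd expect this to come back lower than the count of completed orders, assuming the blank survey dates really are nulls.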

    It really doesn't make sense that it's not working; I'm sure it's something simple, but I just can't find it.

    Randy

  • I was reading through the post you suggested and it did turn on some lights. I think the issue is that although both of those records have dates, if I have any date filtering at all it is going to remove one side of the data or the other, since there is no shared date. I'm going to try adding a Common Date field, filter the date range on that, and see what it does. (A rough sketch of the idea is below.)
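
    Something like this, as one way to define it (column names approximate):

        Common Date:
        CASE WHEN `Survey Date` IS NULL THEN `Completed Date` ELSE `Survey Date` END

    The card would then filter the date range on Common Date, so the rows with no survey still fall inside the range via their Completed Date instead of being dropped.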

    I'll let you know.

    thanks

    Randy