Can this card be built without pre-aggregating the data?

Jones01
Jones01 Contributor

Hi,

As the title says, can this be built without having to pre-aggregate the data via a view or ETL?

I have hourly data for 14 days, 7th Jan 2019 to 20th Jan 2019 (the real dataset is much larger, but this works as an example). The data is a datetime and a count figure.

e.g.

07/01/2019 01:00 6
07/01/2019 02:00 9
07/01/2019 03:00 10
07/01/2019 04:00 12
07/01/2019 05:00 15
...
08/01/2019 01:00 9
08/01/2019 02:00 33
08/01/2019 03:00 122
08/01/2019 04:00 11
08/01/2019 05:00 66


I need the average count for each day of the week.

So the average for a Monday, Tuesday etc.

I can do this in SQL, and I can also do it using the group by in Domo Views and ETL by aggregating the data to the day of the week and then using the average in the card builder.
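For example, the pre-aggregation I mean looks roughly like this (a rough sketch only; the hourly_counts table name is made up, and the datetime/incount columns are from the sample above):

-- collapse the hourly rows to one row per calendar day
SELECT
    DATE(datetime)    AS calendar_date,
    DAYNAME(datetime) AS day_of_week,
    SUM(incount)      AS daily_count
FROM hourly_counts
GROUP BY DATE(datetime), DAYNAME(datetime)
-- the card then averages daily_count by day_of_week,
-- but the hour-level detail is gone and can no longer be filtered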

The problem is that I want the user to be able to filter the hours that make up the aggregation on the fly, within the card,

e.g. only see the hours between midday and 3pm.

Does anyone know how I can achieve this? Once the group by is done I have lost the hours in the card analyser, and a window function within a beast mode won't allow anything like:

avg(sum(incount)) OVER (partition by DATE(datetime), DAY(datetime))

Thanks

Best Answer

  • RobSomers
    RobSomers Coach
    Answer ✓

    I think you can do this within a card by creating a beast mode using the DAYNAME() function on your date column, which you can use as a column or as your x-axis. Then the average for each day of the week would be another beast mode:

    SUM('amount_column')/COUNT(DISTINCT DATE('datetime'))
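
    For example, as a rough sketch (the 'datetime' column name comes from your post, and the extra HOUR() field is just a suggestion for the on-the-fly hour filtering, not something you strictly need):

    DAYNAME('datetime')

    HOUR('datetime')

    Use the first as the x-axis alongside the average above, and drop the second into the card's filters (e.g. keep 12 through 15) so the average is recalculated from only those hours.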


Answers

  • Jones01
    Jones01 Contributor

    That is perfect thank you. Easy when you know how!

  • RobSomers
    RobSomers Coach

    @Jones01 Make sure to mark my post as accepted so that this thread gets marked as solved!
