Rounding Data within a Card

Has anyone ever had an issue where the card's data table shows duplicate values, which are being rounded, even though these values ARE NOT in the actual dataset? Is there a filter or something I can do in Analyzer to prevent this?

Best Answer

  • marcel_luthi (Coach)

    It would seem that many of your Count and Average columns are being retrieved as pre-computed values from the dataset your ETL generates, rather than being aggregated at the card level.

    This would explain why you get two entries for the same half-hour period for the same account. Ideally, any aggregation or calculation that should be dynamic, and that can be affected by locally applied filters, should not be performed in the ETL. Instead, use Aggregation on existing fields, or a Beast Mode that performs the needed aggregation and operations (for instance, if you need a ratio, use something like SUM(value1)/SUM(value2)).
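
    As a minimal sketch of this pattern, assuming hypothetical placeholder columns `ABANDONED_CALLS` and `TOTAL_CALLS` in the underlying dataset (not fields confirmed in this thread), a card-level ratio Beast Mode would be:

        SUM(`ABANDONED_CALLS`) / SUM(`TOTAL_CALLS`)

    Because SUM() is evaluated at the card level, the result collapses to one row per grouping (e.g. account and half hour) and responds to any filters applied on the card, which a ratio pre-computed in the ETL cannot do.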

Answers

  • @Stucker Does your table have subtotals or any aggregated columns?

  • @MichelleH - No subtotals or aggregated columns, but there are a few beast modes

  • @Stucker Can you please share your beast modes and sorting?

  • @MichelleH - Sure! Here is a full table overview. The filters just remove account names, and sorting is by account, date, then half hour. The beast modes just turn the percentage values into actual percentages: 'ABANDONED (%)' * 100 (see the aggregated sketch below the thread). I have another card just like this one, without the half hour field, that has no duplicate issues, so it's either the half hour or the average fields that are causing this. The beast modes are on the other cards as well. Thank you!

  • @Stucker Does it still have the same behavior if you remove the sorting?

  • @MichelleH - It does. I am going to try rebuilding the entire dataset/ETL and see what happens.
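
    A minimal sketch of an aggregated form of the 'ABANDONED (%)' Beast Mode mentioned above, assuming the duplicate rows come from pre-aggregated values in the dataset:

        AVG(`ABANDONED (%)`) * 100

    This is only illustrative; if `ABANDONED (%)` is itself a ratio of two counts, aggregating the underlying counts instead (SUM of abandoned over SUM of total, times 100, with whatever the real column names are) is the more accurate fix, as the accepted answer suggests.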
