Comments
-
@DavidChurchman Yes, that is correct, my apologies. The formula seems to work with the test dataset but not the actual one, so I will need to do some digging. Thank you!
-
@DavidChurchman Sorry, I did add some rows just to test your formula, since it wasn't working on my actual dataset. I now realize that it's working as intended!! Sorry. I still don't know why it won't work with my dataset, but I will look into it.
-
@DavidChurchman I think this is the right track, but it's not perfect. The second-to-last row (A, 3, M) should be 0.33, since it's one row with that Network/Hour/Date combo and 3 avails.
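A minimal pandas sketch of the expected calculation, as I understand it: rows per Network/Hour/Date combo divided by that row's avails, so one row with 3 avails gives 1/3 ≈ 0.33. The column names and sample values here are my assumptions, not the actual dataset.

```python
import pandas as pd

# Hypothetical sample: the (A, 3, M) combo appears in one row with 3 avails
df = pd.DataFrame({
    "Network": ["A", "A"],
    "Hour": [3, 2],
    "Date": ["M", "M"],
    "Avails": [3, 2],
})

# Count rows per Network/Hour/Date combo, then divide by Avails
df["RowCount"] = df.groupby(["Network", "Hour", "Date"])["Avails"].transform("size")
df["Ratio"] = (df["RowCount"] / df["Avails"]).round(2)
```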
-
This feature really needs to be added. There's almost no point in migrating our dashboards to Apps if users can't export content.
-
Just updating everyone, this feature is now available in beta through Apps.
-
This is exactly what I needed, I always forget about this tile. Thank you!
-
@BraydenJ I don't quite remember how I solved this, but I believe I ended up doing the calculation as a beast mode instead of using ETL. Good luck!
-
Thank you—I was on the right track doing this, but how would I also include rows that are not in both datasets? @MarkSnodgrass
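In pandas terms, keeping rows that are not in both datasets is a full outer join; this is a sketch under the assumption that Name is the join key (the datasets and columns here are invented for illustration).

```python
import pandas as pd

# Hypothetical datasets; "Name" as the join key is an assumption
left = pd.DataFrame({"Name": ["Ann", "Bob"], "Sales": [10, 20]})
right = pd.DataFrame({"Name": ["Bob", "Cam"], "Region": ["E", "W"]})

# how="outer" keeps rows found in either dataset, not just both;
# indicator=True labels where each row came from
merged = left.merge(right, on="Name", how="outer", indicator=True)
```

The `_merge` column then distinguishes `left_only`, `right_only`, and `both` rows, which is handy when only the unmatched rows matter.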
-
@ColemenWilson Ugh, I apologize — what makes this complicated is that the Name field is not unique, so there is no unique identifier for each row across both sources.
-
Thank you — just needed to edit the card interaction to drill down instead of filter.
-
Any other ideas?
-
I suppose I could join the 4 datasets together each with an associated constant and then filter each card by the correct constant.
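The stack-with-a-constant idea above could be sketched like this in pandas: append the four datasets with a label column, then each card filters on its own label. The dataset names and values are placeholders.

```python
import pandas as pd

# Hypothetical: four datasets stacked with a constant "Source" column,
# so each card can filter on its own constant
datasets = {
    "ds1": pd.DataFrame({"Value": [1, 2]}),
    "ds2": pd.DataFrame({"Value": [3]}),
    "ds3": pd.DataFrame({"Value": [4]}),
    "ds4": pd.DataFrame({"Value": [5]}),
}
combined = pd.concat(
    [df.assign(Source=name) for name, df in datasets.items()],
    ignore_index=True,
)

# A card for the first dataset would filter on its constant
card1 = combined[combined["Source"] == "ds1"]
```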
-
Not really, since the different data sets are created in a dataflow with various different joins and aggregations.
-
Incredibly helpful, thank you.
-
What would the join key be for the second join tile?
-
Hi, just bumping this as a new error has come up. Now I'm getting an error for this formula tile that says "failed to convert value 'NaN' from type 'Floating Decimal' to type 'Fixed Decimal'". It's my understanding that 'NaN' means "not a number," but how would I go about debugging this?
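For what it's worth, a NaN in a ratio column usually comes from a 0/0 somewhere in the formula. A sketch of how to spot and guard against it (the values here are invented; in a formula tile the guard would be the equivalent `CASE WHEN denominator = 0 THEN 0 ELSE ... END`):

```python
import math

# Debugging check: find rows whose computed value came out as NaN
rows = [1.5, float("nan"), 2.0]  # hypothetical computed column
bad = [i for i, v in enumerate(rows) if math.isnan(v)]

# Guarded formula: never divide when the denominator is zero
def safe_ratio(num: float, den: float) -> float:
    return 0.0 if den == 0 else num / den
```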
-
@MichelleH @RobSomers extremely helpful, thank you for telling me about this feature!!
-
Interesting, reading about it now. Thank you, @RobSomers! Is there a way to toggle the variable within the dashboard without going into Analyzer? The dashboard is meant to be a user interface for sales teams, without any need (or want) for them to open the cards in Analyzer.
-
@GrantSmith Figured it out: I sorted the table by Total and aggregated from there. Thank you!
-
Hi @GrantSmith thanks for the quick response. Unfortunately the table remains as is even when aggregating the total by sum. I have a pie chart that sums up everything correctly (i.e. Connected TV has 49,585), am I missing something? Appreciate the help.
-
Great, thank you!
-
Thank you, everyone, for your responses, especially @MarkSnodgrass. That is a very clever implementation that I may find use for in the future. I don't think these solutions apply here, but thank you.
-
@GrantSmith thank you. Unfortunately the dataset is far too large and has far too many different numbers listed in that column to use the split function.
-
@ST_-Superman-_ That's a really smart solution, thanks. But unfortunately, as the example I depicted in ID #2 shows, my initial thought process wouldn't work in this problem. I only need one row from that grouping.
-
Time is in this format: 07:32:39. You brought up a good point, and I need to rethink the problem. I have a dataset that looks like this:

| ID | Time     |
|----|----------|
| 1  | 07:32:39 |
| 1  | 07:32:40 |
| 1  | 07:32:41 |
| 1  | 07:32:41 |
| 1  | 07:32:42 |
| 1  | 14:46:59 |
| 1  | 14:47:00 |
| 1  | 14:47:01 |
| 2  | 01:04:29 |
| 2  | 01:04:30 |
| 2  | 01:04:31 |
| 2  | 01:04:31 |

I need only one row from each section/time range. …
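One way to sketch this "one row per section" logic in pandas: start a new section whenever consecutive timestamps within an ID are far enough apart, then keep the first row of each section. The 60-second gap threshold is an assumption; the data matches the sample above.

```python
import pandas as pd

df = pd.DataFrame({
    "ID":   [1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2],
    "Time": ["07:32:39", "07:32:40", "07:32:41", "07:32:41", "07:32:42",
             "14:46:59", "14:47:00", "14:47:01",
             "01:04:29", "01:04:30", "01:04:31", "01:04:31"],
})
df["t"] = pd.to_timedelta(df["Time"])

# Flag rows whose gap from the previous row (within the same ID) exceeds 60s,
# then number the sections with a running count of those flags
gap = df.groupby("ID")["t"].diff() > pd.Timedelta(seconds=60)
df["section"] = gap.groupby(df["ID"]).cumsum()

# Keep one row per ID/section
result = df.groupby(["ID", "section"], as_index=False).first()
```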
-
Thank you so much!
-
Great, thank you! How would I approach the logic of removing duplicates only within a unique ID? I don't want to remove all duplicates, just `Num`s that appear multiple times within a corresponding ID (see example). Thanks, @RobSomers!
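In pandas this would be dedup on the (ID, Num) pair rather than on Num alone, so the same Num under a different ID survives. The column names and sample data are assumptions based on the description.

```python
import pandas as pd

# Hypothetical example: Num 5 repeats under ID 1 and under ID 2
df = pd.DataFrame({
    "ID":  [1, 1, 1, 2, 2],
    "Num": [5, 5, 7, 5, 5],
})

# Drop rows duplicated on (ID, Num); duplicates across IDs are kept
deduped = df.drop_duplicates(subset=["ID", "Num"])
```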
-
@GrantSmith I got it now. Thanks for the concise explanation!
-
@GrantSmith Ok, this has been a great help and I understand the logic. I did all the steps, but I'm hung up on the end: how do I find the sum?
-
@GrantSmith Hi, thanks for the in-depth answer. Could you elaborate on what you mean by a Calendar dataset? Also, could you maybe provide a pseudo-ETL to help me visualize the dataflow?
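For readers with the same question: a Calendar dataset is typically just one row per date over a range, which the fact data is LEFT JOINed onto so dates with no activity still appear. A small pandas sketch (the date range and fact data are invented for illustration):

```python
import pandas as pd

# Calendar dataset: one row per date over an assumed range
calendar = pd.DataFrame({"Date": pd.date_range("2023-01-01", "2023-01-07")})

# Hypothetical fact data with gaps (no rows for most dates)
facts = pd.DataFrame({
    "Date": pd.to_datetime(["2023-01-02", "2023-01-05"]),
    "Amount": [10, 20],
})

# Pseudo-ETL: Calendar -> LEFT JOIN facts ON Date -> fill missing with 0
joined = calendar.merge(facts, on="Date", how="left")
joined["Amount"] = joined["Amount"].fillna(0)
```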