Rolling 30 Day Formula in magic ETL

edited February 2022 in Magic ETL


I am trying to write the formula above within Magic ETL. Column A consists of static values. Column B starts with 0 as its first value and follows the formula shown in the image: it resets its count every time it reaches a number greater than or equal to 30.

The only way I can find to do this is with a LAG() function, which it appears is not available in Domo.
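Since the formula image isn't reproduced in the thread, here is a minimal Python sketch of the logic as described: Column B starts at 0, accumulates the day-to-day difference of Column A, and resets to 0 once it reaches 30 or more. The function name and the exact reset behavior are assumptions based on the description above.

```python
# Hypothetical sketch of the spreadsheet formula described above (not Domo code).
# col_a: static, ascending day values. Returns Column B: a running difference
# that starts at 0 and resets to 0 whenever it reaches 30 or more.
def rolling_30(col_a):
    col_b = [0]
    for prev, cur in zip(col_a, col_a[1:]):
        running = col_b[-1] + (cur - prev)
        col_b.append(0 if running >= 30 else running)
    return col_b
```

For example, `rolling_30([0, 10, 20, 30, 40, 50, 60])` yields `[0, 10, 20, 0, 10, 20, 0]` under this reading. The row-by-row dependence on the previous value of Column B is exactly why LAG() (or some window function) is needed in an ETL setting.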


  • LAG is available as a Beast Mode; however, you need to talk to your CSM to get window functions enabled.

    **Was this post helpful? Click Agree or Like below**
    **Did this solve your problem? Accept it as a solution!**
  • I believe I came up with a way for you to do this using multiple Rank & Window and Formula tiles in Magic ETL. If you copy this into a blank Magic ETL, you should be able to see all the tiles and what they are doing.

    {"contentType":"domo/dataflow-actions","data":[{"name":"Test Days","id":"53c4747b-3ef7-4c46-a2c7-7673ee34d0c2","type":"LoadFromVault","gui":{"x":84,"y":156,"color":3238043,"colorSource":null},"dependsOn":[],"removeByDefault":false,"notes":[],"dataSourceId":"c1f4230f-1474-437e-a073-a9e442c0f8ba","executeFlowWhenUpdated":false,"onlyLoadNewVersions":false,"columnSettings":{},"visiblePartitionColumn":null,"versionWindow":null},{"name":"Lag to get previous entry","id":"0b181c69-c181-4f20-b1e3-a70e3993469a","type":"WindowAction","gui":{"x":204,"y":156,"color":null,"colorSource":null},"dependsOn":["53c4747b-3ef7-4c46-a2c7-7673ee34d0c2"],"removeByDefault":false,"notes":[],"partitionedAggregation":false,"additions":[{"name":"Lag","operation":{"type":"OFFSET","operationType":"LAG","column":"DaysPast","amount":1,"defaultValue":null}}],"orderRules":[{"column":"DaysPast","caseSensitive":false,"ascending":true}],"groupRules":[]},{"name":"Calc Day Diff","id":"bfaccf28-96e3-4213-81b4-c952ff226acc","type":"ExpressionEvaluator","gui":{"x":312,"y":156,"color":null,"colorSource":null},"dependsOn":["0b181c69-c181-4f20-b1e3-a70e3993469a"],"removeByDefault":false,"notes":[],"expressions":[{"expression":"`DaysPast` - `Lag`","fieldName":"DayDiff","settings":null}]},{"name":"Running Total Unlimited","id":"ffe767a9-e7bc-4089-b13f-afc61d0444a4","type":"WindowAction","gui":{"x":432,"y":156,"color":null,"colorSource":null},"dependsOn":["bfaccf28-96e3-4213-81b4-c952ff226acc"],"removeByDefault":false,"notes":[],"partitionedAggregation":false,"additions":[{"name":"Runningtotal","operation":{"type":"FRAMED","operationType":"SUM","column":"DayDiff","preceding":null,"following":0}}],"orderRules":[{"column":"DaysPast","caseSensitive":false,"ascending":true}],"groupRules":[]},{"name":"Create Group Number","id":"99283d04-b4f3-4417-9698-cd0f4573429e","type":"ExpressionEvaluator","gui":{"x":552,"y":156,"color":null,"colorSource":null},"dependsOn":["ffe767a9-e7bc-4089-b13f-afc61d0444a4"],"removeByDefault":false,"notes":[],"expressions":[{"expression":"FLOOR(`Runningtotal`/30)","fieldName":"GroupNumber","settings":null}]},{"name":"Create Running Total","id":"4e8f7536-fdf3-4197-9df6-33a6b8052b65","type":"WindowAction","gui":{"x":864,"y":156,"color":null,"colorSource":null},"dependsOn":["cbb1075c-e956-4197-a08d-ea3f3b3e2dca"],"removeByDefault":false,"notes":[],"partitionedAggregation":false,"additions":[{"name":"RunningTotal2","operation":{"type":"FRAMED","operationType":"SUM","column":"DayDiff","preceding":null,"following":0}}],"orderRules":[{"column":"DaysPast","caseSensitive":false,"ascending":true}],"groupRules":[{"column":"GroupNumber","caseSensitive":false}]},{"name":"Test Days 30","id":"e19de3e9-40ba-4a98-b86f-bac8f6d0dbd7","type":"PublishToVault","gui":{"x":960,"y":156,"color":null,"colorSource":null},"dependsOn":["4e8f7536-fdf3-4197-9df6-33a6b8052b65"],"removeByDefault":false,"notes":[],"dataSource":{"guid":"b09b8705-314f-4012-b6a0-a41a6b472ba7","type":"DataFlow","name":"Test Days 30","cloudId":"domo"},"versionChainType":"REPLACE","partitionIdColumns":[]},{"name":"Lag to get previous group","id":"2de813ec-928a-44db-b4e6-f34653d6a63e","type":"WindowAction","gui":{"x":660,"y":156},"dependsOn":["99283d04-b4f3-4417-9698-cd0f4573429e"],"removeByDefault":false,"notes":[],"partitionedAggregation":false,"additions":[{"name":"PreviousGroup","operation":{"type":"OFFSET","operationType":"LAG","column":"GroupNumber","amount":"1"}}],"orderRules":[{"column":"DaysPast","caseSensitive":false,"ascending":true}],"groupRules":[]},{"name":"Add Formula","id":"cbb1075c-e956-4197-a08d-ea3f3b3e2dca","type":"ExpressionEvaluator","gui":{"x":756,"y":156},"dependsOn":["2de813ec-928a-44db-b4e6-f34653d6a63e"],"removeByDefault":false,"notes":[],"expressions":[{"fieldName":"NewGroup","expression":"CASE when `GroupNumber` <> `PreviousGroup` THEN 'Y' ELSE 'N' END","settings":null},{"fieldName":"DayDiff","expression":"case when `NewGroup` = 'Y' THEN 0 ELSE `DayDiff` END","settings":null}]}]}

    Here is a walk through of what I am doing.

    I created a webform of test data with a single column called DaysPast, using the numbers from your screenshot.

    The first rank & window tile does a LAG to get the previous DaysPast value and adds it as a second column so that I can calculate the difference.

    The formula tile "Calc Day Diff" just subtracts Lag from DaysPast to create a DayDiff column.

    The 2nd rank & window tile creates a running total that is not capped, but will be used to create groups.

    The Create Group Number formula tile takes that running total number and divides it by 30 and uses the floor function to round down. This creates some groups for us that we can partition by eventually.

    The 3rd rank & window tile uses Lag to get the previous group number so that we can identify if we changed groups and reset the daydiff to 0.

    The 3rd formula tile creates a Y/N column to identify when a new group started, and then overwrites DayDiff with 0 when NewGroup equals 'Y'.

    Finally, our 4th rank & window tile creates a running total of DayDiff, but it is partitioned by the group number so that we don't go above 30.

    Hope this works for you.
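    For anyone who wants to trace the tile sequence above outside of Domo, it can be sketched in plain Python. This is only an illustration of the same steps, not Domo code; column names mirror the walkthrough.

```python
from itertools import accumulate

# Plain-Python sketch of the Magic ETL tile sequence described above.
def magic_etl_sketch(days_past):
    days = sorted(days_past)
    # Rank & Window 1: LAG to get the previous DaysPast as a second column
    lag = [None] + days[:-1]
    # Formula "Calc Day Diff": DayDiff = DaysPast - Lag (0 for the first row)
    day_diff = [0 if l is None else d - l for d, l in zip(days, lag)]
    # Rank & Window 2: uncapped running total of DayDiff
    running = list(accumulate(day_diff))
    # Formula "Create Group Number": FLOOR(Runningtotal / 30)
    group = [r // 30 for r in running]
    # Rank & Window 3 + Formula "Add Formula": reset DayDiff to 0 on group change
    prev_group = [None] + group[:-1]
    reset_diff = [0 if g != pg else dd
                  for dd, g, pg in zip(day_diff, group, prev_group)]
    # Rank & Window 4: running total of DayDiff, partitioned by GroupNumber
    totals, out = {}, []
    for g, dd in zip(group, reset_diff):
        totals[g] = totals.get(g, 0) + dd
        out.append(totals[g])
    return out
```

    For example, `magic_etl_sketch([0, 10, 20, 35, 45])` returns `[0, 10, 20, 0, 10]`: the 35-day row crosses the 30 boundary, starts group 1, and the partitioned running total restarts at 0.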

    **Check out my Domo Tips & Tricks Videos**

    **Make sure to <3 any user's posts that helped you.**
    **Please mark as accepted the posts that solved your issue.**
  • @MarkSnodgrass

    Thank you for your efforts. I do have SQL experience, but my Domo/ETL knowledge is limited. When you say "If you copy this into a blank Magic ETL" I am not sure where I should be pasting your code.

  • @CTBH if you go to your Data Center and then click on Magic ETL, it will open an empty Magic ETL workspace. Highlight the code block above and press Ctrl+C to copy it, then go to your empty Magic ETL workspace and press Ctrl+V to paste it in. This will create all the tiles I have shown above.

    You can then connect your input dataset to it and change column names as needed. Hope this makes sense.

  • @MarkSnodgrass oooooh! Thank you. I thought I needed to paste it into something in order to get it to work. Again, I appreciate your help thus far. I will post back after I tinker around with this, hopefully getting it to work, or if I run into any issues.

  • @MarkSnodgrass Just wanted to check back in to say thanks for giving me a boost. The backend data changed a bit since I posted this, and it now uses ascending dates instead of descending, so I wasn't able to just plug in what you sent and didn't quite follow it. BUT! The running total/30-day grouping gave me an idea to do it (I think?) differently. Instead of doing another LAG function, I'm just taking the diff of the diff, which allows me to compare 30-day groupings side by side. Now that I'm typing this, I think that is what you were saying to do? Either way, this helped. Thanks.

  • @MarkSnodgrass

    After trying to do it a slightly different way and then coming back to your way, it looks like neither is working 100%. The method you posted works for most cases, but it starts to break down if you count by ones. If you look at the attached dataset, it works up until you get to days past @189. The next time the count should reset is 219. I see why it's doing it, but I can't come up with a good way to fix it.
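    If it helps pin down the failure mode, here is a small self-contained Python illustration (the data and mechanism here are my assumptions, not taken from the attached dataset): the intended reset point is always 30 days after the previous actual reset, allowing for overshoot, but FLOOR(RunningTotal/30) places its group boundaries at fixed multiples of 30. Any overshoot makes the two drift apart, and counting by ones exposes the drift.

```python
# Compare where resets *should* happen (30 days after the previous actual
# reset, allowing overshoot) with where a FLOOR(running_total / 30) grouping
# actually places them. Illustrative data, not the attached dataset.
def intended_resets(days):
    resets, acc = [], 0
    for prev, cur in zip(days, days[1:]):
        acc += cur - prev
        if acc >= 30:          # 30-day window filled: reset here
            resets.append(cur)
            acc = 0
    return resets

def floor_resets(days):
    resets, prev_group = [], 0
    for cur in days[1:]:
        group = (cur - days[0]) // 30   # uncapped running total == cur - days[0]
        if group != prev_group:         # group change forces a reset
            resets.append(cur)
            prev_group = group
    return resets

days = [0, 35] + list(range(36, 100))   # one 35-day jump, then counting by ones
```

    Here `intended_resets(days)` gives `[35, 65, 95]` while `floor_resets(days)` gives `[35, 60, 90]`: after the 5-day overshoot at 35, the floor-based groups keep resetting 5 days early, which matches the kind of misalignment described above (a reset expected at 219 landing somewhere else).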