Recursion

I have a dataset that is too large on the SQL source to bring in monthly, so I have to bring it in daily. I need to find a way to keep a few months of data in Domo. What would be the best way to do this? Should I use recursion? Is partitioning another option? I'm not too sure how these work. Would someone be able to help me understand the differences and point me to resources so I can learn how to use them?

Answers

  • If you are using Domo Workbench to push your data to Domo, you can use partitions to send only the new data. Magic ETL also supports partitions. You could also use the append option on your dataset so that each run continues to add to your Domo dataset.
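
    For example, a daily Workbench job could run a query along these lines against the source, with the date column doubling as the partition column in the job settings. This is only a sketch: sales_fact and order_date are placeholder names, and the syntax shown is SQL Server, so adjust for your source database.

    ```sql
    -- Hypothetical daily pull: only yesterday's rows.
    -- sales_fact / order_date are placeholder names; order_date would also
    -- be configured as the partition column in the Workbench job.
    SELECT *
    FROM sales_fact
    WHERE order_date >= DATEADD(DAY, -1, CAST(GETDATE() AS DATE))
      AND order_date <  CAST(GETDATE() AS DATE);
    ```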

  • Thanks Mark, I am using Workbench.

  • I would review this KB article on Workbench partitions and see if that works for you.

    https://domo-support.domo.com/s/article/360062446514?language=en_US

  • You can bring in portions of your data as separate datasets and then create an ETL flow that appends them into a larger dataset. Say you want two years of data: the previous year's data you might pull weekly, each Saturday, while for the current year you might pull the current quarter daily and the past quarters weekly.
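
    As a rough illustration of splitting by cadence (placeholder table/column names, SQL Server syntax), the daily and weekly jobs would only differ in their WHERE clauses:

    ```sql
    -- Daily job: current quarter only (sketch; names are placeholders).
    SELECT *
    FROM sales_fact
    WHERE order_date >= DATEADD(QUARTER, DATEDIFF(QUARTER, 0, GETDATE()), 0);

    -- Weekly (Saturday) job: older than the current quarter, back two years.
    SELECT *
    FROM sales_fact
    WHERE order_date <  DATEADD(QUARTER, DATEDIFF(QUARTER, 0, GETDATE()), 0)
      AND order_date >= DATEADD(YEAR, -2, CAST(GETDATE() AS DATE));
    ```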

    Recursion is another option if you can identify the differences and make sure updates are processed. There's an easy-to-follow recursion example you can Google. With each run, you append the new records. But what happens if existing records change? You need to account for that so your resulting dataset picks up those changes.
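
    The core of a recursive SQL dataflow is usually a union of the new rows with the old rows that were not replaced, so that updated records win. A minimal sketch, assuming a hypothetical key column order_id and MySQL-style syntax:

    ```sql
    -- new_daily_rows  = this run's incremental load
    -- historical_rows = the dataflow's own output from the previous run
    SELECT * FROM new_daily_rows
    UNION ALL
    SELECT h.*
    FROM historical_rows h
    LEFT JOIN new_daily_rows n ON n.order_id = h.order_id
    WHERE n.order_id IS NULL;  -- keep an old row only if it wasn't re-sent
    ```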

    I use recursion for snapshots: I grab the results of aggregated data (i.e., grouped and summed) each week so I can report on changes over time.
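
    That snapshot pattern boils down to stamping an aggregate with the run date and appending it each week; something like this sketch (placeholder names, MySQL-style syntax):

    ```sql
    -- Weekly snapshot: aggregate, stamp with today's date, then append
    -- the result to the snapshot dataset.
    SELECT
        CURDATE()         AS snapshot_date,
        region,
        SUM(sales_amount) AS total_sales
    FROM new_daily_rows
    GROUP BY region;
    ```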


  • Chris_Wolman (Contributor)

    Another option could be an append job from Workbench plus a dataflow with a filter tile to exclude rows older than a certain date. It depends on whether you need to reload previous days' data or not.
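
    In a SQL dataflow, that filter step would amount to a rolling date window on the appended dataset; the Filter tile in Magic ETL expresses the same condition. A sketch with placeholder names (MySQL-style syntax, 90 days assumed):

    ```sql
    -- Keep only the trailing 90 days of the appended dataset.
    SELECT *
    FROM appended_dataset
    WHERE order_date >= CURDATE() - INTERVAL 90 DAY;
    ```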

    Chris


  • Thanks everyone for your contributions. I have started using partitioning and it works great. However, I have another problem: just 3 days of data is 2.6M rows, which is huge. I'm wondering if there is a way to build an ETL that aggregates this data (daily, weekly, or monthly as needed), stores the result, and discards the rest. Is this possible in Domo?

  • Sure. You can aggregate in an ETL (group by) and send the result to a tile that appends a dataset.
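
    For instance, 2.6M row-level records could collapse to one row per day and store before the append step. A sketch with made-up names (MySQL-style syntax):

    ```sql
    -- Group-by step whose output feeds the append.
    SELECT
        DATE(transaction_ts) AS transaction_date,
        store_id,
        COUNT(*)             AS row_count,
        SUM(sales_amount)    AS total_sales
    FROM raw_daily_load
    GROUP BY DATE(transaction_ts), store_id;
    ```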


  • Thanks @ArborRose. How then do I discard the previous data?

  • ArborRose (Coach)
    Answer ✓

    Use the filter tile. When you review Domo's recursion example, you'll see they set up a dataset and then use a calculated field to filter what is kept and what is discarded. You can flush previous data the same way.
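
    As a sketch of that keep/discard logic (column and dataset names are hypothetical, MySQL-style syntax), the calculated field is just a flag you then filter on, here keeping six months:

    ```sql
    -- keep_row plays the role of the calculated field; the outer WHERE
    -- is what the Filter tile does in Magic ETL.
    SELECT *
    FROM (
        SELECT
            t.*,
            CASE WHEN snapshot_date >= CURDATE() - INTERVAL 6 MONTH
                 THEN 1 ELSE 0 END AS keep_row
        FROM snapshot_dataset t
    ) flagged
    WHERE keep_row = 1;
    ```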
