Need help mimicking recursion with partitioned datasets

jimsteph
jimsteph Contributor

I need help implementing a particular use case for partitions: importing selected partitions from a dataset, modifying the records, then outputting the data back into its original partition (or a new partition) in the original dataset. There are two uses for this I can think of right off the bat:

  • A dataset can have a maximum of 1500 partitions. If you set your partition key to record_date you can store just over four years' worth of data. One potential way around this is to vary the partition key based on the age of the data: if the data is less than two years old, set the key to record_date, but if it's older, use LAST_DAY(record_date) to stuff each month's records into a single partition; you'd then just need to delete that month's daily partitions once they're rolled up.
  • Datasets can be large, and partitions can be used to make ETL execution go much quicker. For example, I have a transaction dataset that has half a billion rows, and I really don't want to pull in the entire dataset every time I execute the ETL and write it all back out when I'm done. Transactions don't always come in on their record_date, so what I want to do is pull in, say, the past week's worth of new transaction data, pull in the same week's worth from the partitioned dataset, compare and de-duplicate, then save the combined records back into the dataset.
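The age-based partition key from the first bullet can be sketched outside Domo like this. This is an illustrative Python version only: `partition_key` is a made-up helper, and the 730-day cutoff is an approximation of "two years"; inside Domo you'd express the same rule with a formula using LAST_DAY(record_date).

```python
from datetime import date, timedelta
import calendar

def partition_key(record_date: date, today: date) -> date:
    """Illustrative rule: daily partitions for recent records,
    one monthly partition (keyed on the month's last day, like
    SQL LAST_DAY()) for anything older than ~two years."""
    cutoff = today - timedelta(days=730)  # rough two-year window
    if record_date >= cutoff:
        return record_date  # one partition per day
    # Older data: collapse the whole month into a single partition.
    last = calendar.monthrange(record_date.year, record_date.month)[1]
    return date(record_date.year, record_date.month, last)
```

With daily keys for two years and monthly keys beyond that, the same 1500-partition budget covers decades instead of just over four years.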

The big problem with partitions is that Domo will refuse to run the ETL if there's even a hint of recursion: you cannot write to a partitioned dataset even if the recursion has nothing to do with that dataset (I like to use recursive loops to increment or decrement dates for backfill purposes). This makes the obvious design patterns impossible.

My question to you, dear friends, is how do I — and by extension you — design an ETL that lets you import from, modify, and output data back to a partitioned dataset? The only thing I can think of off the top of my head is to try to fake Domo out: write to a dataset in one ETL, copy that dataset to a second dataset in another ETL (maybe even a third), then import the copy back into the first ETL — recursion with one or two degrees of separation, if you will. My gut feeling is that Domo would somehow track that and stop it from working. Even worse, it's inelegant! Can anyone come up with another way?

Thanks, Jim

Best Answer

  • Sam_Arigato
    Sam_Arigato Member
    Answer ✓

    Good Morning Jim,

    In our experience, we have found that Domo does interfere with that type of ETL loop.


    You could create a recursive dataflow that processes just the last 7 days of data. The output of that recursive dataflow could then feed your partitioned dataset.

    The de-duplication logic would live in the recursive dataflow. The recursive dataflow would also have to protect the integrity of the partitions, ensuring no stray records or partial partitions exist.
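    A minimal sketch of that de-dup step, assuming each transaction row carries a unique key (here a hypothetical `txn_id` column) and that newly arrived rows should win over existing ones. Rows are plain dicts standing in for the dataflow's records; the real logic would be built with Domo ETL tiles, not Python.

    ```python
    def merge_week(existing_rows, new_rows, key="txn_id"):
        """Combine one week of existing partitioned data with the
        week's incoming transactions, de-duplicating on `key`.
        Returning the full window means whole partitions get
        rewritten, so no partial partitions are left behind."""
        merged = {row[key]: row for row in existing_rows}  # existing first
        for row in new_rows:
            merged[row[key]] = row  # new data overwrites duplicates
        return list(merged.values())
    ```

    Rewriting the entire 7-day window each run is what keeps the partitions self-consistent: every partition in the window is replaced wholesale rather than appended to.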

    Hope this helps!

    Brought to you by Arigato Analytics

Answers

  • jimsteph
    jimsteph Contributor

    I had to think on this for a bit, but it does make sense: anything that would have needed recursion can be recreated by breaking the ETL into two parts and writing an intermediate dataset between them. Time to get to work …

    Thanks!