Need help mimicking recursion with partitioned datasets
I need help implementing a particular use case for partitions: importing selected partitions from a dataset, modifying the records, then outputting the data back into its original partition (or a new partition) in the original dataset. There are two uses for this I can think of off the bat:
- A dataset can have a maximum of 1,500 partitions. If you set your partition key to record_date, you can store just over four years' worth of data. One potential way around this is to set your partition key to different values depending on how old the data is: if the data is less than two years old, set the key to record_date, but if it's older, use LAST_DAY(record_date) to stuff each month's records into a single partition; you'd then just need to delete the daily partitions for that month. (See the sketch after this list.)
- Datasets can be large, and partitions can make ETL execution go much quicker. For example, I have a transaction dataset with half a billion rows, and I really don't want to pull in the entire dataset every time I execute the ETL and write it all back out when I'm done. Transactions don't always come in on their record_date, so what I want to do is pull in, say, the past week's worth of new transaction data, pull in the same week's worth of transactions from the partitioned dataset, compare and deduplicate, then save the surviving existing and new records back into the dataset.
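For the first bullet, the aging partition key would look something like this minimal sketch, assuming a MySQL-style SQL DataFlow; the transactions table name and the two-year cutoff expression are illustrative:

```sql
-- Partition key: daily partitions for recent data, monthly partitions for older data.
-- `transactions` is an illustrative input name.
SELECT
  t.*,
  CASE
    WHEN t.record_date >= DATE_SUB(CURDATE(), INTERVAL 2 YEAR)
      THEN t.record_date            -- one partition per day
    ELSE LAST_DAY(t.record_date)    -- one partition per month
  END AS partition_key
FROM transactions t;
```

The daily partitions for a month would then be deleted once that month's rows have been rewritten into the single monthly partition.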
The big problem with partitions is that if there's even a hint of recursion, Domo will refuse to run the ETL: you cannot write to a partitioned dataset even if the recursion has nothing to do with it (I like to use recursive loops to increment or decrement dates for backfill purposes). This makes the obvious design patterns impossible.
My question to you, dear friends, is how do I — and by extension you — design an ETL that lets you import from, modify, and output data back to a partitioned dataset? The only thing I can think of off the top of my head is to try to fake Domo out by writing to a dataset in one ETL, rewriting that dataset to a different dataset in a second (and maybe third) ETL, and then importing the different dataset back into the first ETL: recursion with one or two degrees of separation, if you will. My gut feeling is that Domo would somehow track that and stop it from working. Even worse, it's inelegant! Can anyone come up with another way?
Thanks, Jim
Best Answer
Good Morning Jim,
In our experience, Domo does interfere with that type of ETL loop.
You could create a recursive dataflow that runs over the last 7 days. The output of that recursive dataflow could then feed your partitioned dataflow. The de-duplicating logic would happen in the recursive dataflow, which would also have to protect the partitions' integrity, ensuring no stray records or partial partitions exist.
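A minimal sketch of that de-duplication step, again assuming a MySQL-style dataflow; new_transactions, existing_partition_rows, and transaction_id are illustrative names standing in for whatever inputs and unique key your real data provides, and new rows are assumed to win over existing copies:

```sql
-- Keep the last 7 days only: new rows replace existing rows with the same
-- transaction_id; untouched existing rows are carried through unchanged.
SELECT n.*
FROM new_transactions n
WHERE n.record_date >= DATE_SUB(CURDATE(), INTERVAL 7 DAY)

UNION ALL

SELECT e.*
FROM existing_partition_rows e
WHERE e.record_date >= DATE_SUB(CURDATE(), INTERVAL 7 DAY)
  AND NOT EXISTS (
        SELECT 1
        FROM new_transactions n
        WHERE n.transaction_id = e.transaction_id
      );
```

Because the downstream dataflow replaces whole daily partitions, the output needs to contain every surviving row for each day it touches, not just the changed rows; that is what keeps the partitions from ending up partial.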
Hope this helps!
Brought to you by Arigato Analytics
Answers
I had to think on this for a bit, but it does make sense: anything that would have needed recursion can be recreated by breaking the ETL into two parts and generating an intermediate dataset. Time to get to work …
Thanks!