Use recursive dataflows to update datasets
We have 20 to 30 massive tables (40+ million rows each) that need to be updated, and I am wondering whether a recursive dataflow could be an alternative way to update all of these datasets. From my research, setting up a recursive dataflow seems to be a two-step process: first, add the input datasets to create an output dataset; then add that output dataset back in as an input and remove the original input dataset. My question is whether this can run automatically as a single dataflow, so that we do not need to update the dataset manually. As I understand it, the ETL process has to run once to create the output dataset, and then again to append the updated data to the historical data. Is this correct?
Does this mean that we have to manually run two ETL processes to update the datasets, or can we make this process automatic? A sketch of the pattern I have in mind is below. Thanks in advance!
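For concreteness, here is roughly what I picture the recursive append step looking like in a SQL DataFlow. This is only a sketch: the table names (`historical_output`, `new_input`) and the `batch_date` column are placeholders, not our actual schema.

```sql
-- Recursive append sketch (hypothetical names).
-- `historical_output` is the dataset this dataflow produced on its
-- previous run, fed back in as an input on subsequent runs.
-- `new_input` holds the latest incremental rows.

-- Keep historical rows that are not being replaced by the new batch...
SELECT *
FROM historical_output
WHERE batch_date NOT IN (SELECT DISTINCT batch_date FROM new_input)

UNION ALL

-- ...and append the new rows.
SELECT *
FROM new_input
```

The part I am unsure about is the bootstrap: on the very first run `historical_output` does not exist yet, which is why the setup appears to require two passes.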
Comments
Hi,
This is a topic that was brought up a while back and got some really thorough explanations from Domo reps. Check out this link:
They do a better job detailing it than I would.
Hope this helps,
ValiantSpur
**Please mark "Accept as Solution" if this post solves your problem
**Say "Thanks" by clicking the "heart" in the post that helped you.