Use recursive dataflows to update datasets
We have 20 to 30 massive tables (40+ million rows each) that need to be updated, and I am wondering whether recursive dataflows could be an alternative for updating all of these datasets. From my research, setting one up seems to be a two-step process: first run the dataflow with the original input datasets to create the output dataset, then add that output back as an input and remove the original dataset. My question is whether this can then run automatically as a single dataflow, so that we do not need to update the dataset manually. As I understand it, the ETL process has to run once to create the output dataset, and then again to append the updated data to the historical data. Is this correct?
Does this mean that we have to manually run two ETL processes to update the datasets, or can we make this process automatic? Thanks in advance!
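In SQL terms, here is roughly what I am picturing for the recurring run, once the output dataset has been fed back in as an input (just a rough sketch; `historical_sales`, `daily_sales`, and the `id` key are placeholder names for our actual tables):

```sql
-- Input 1: historical_sales  -- the dataflow's own output, added back as an input
-- Input 2: daily_sales       -- the new/updated rows coming from the connector

-- Keep every historical row that is NOT being replaced by an incoming row...
SELECT h.*
FROM historical_sales h
LEFT JOIN daily_sales d ON h.id = d.id
WHERE d.id IS NULL

UNION ALL

-- ...then append the fresh rows, so updates overwrite and new rows accumulate.
SELECT d.*
FROM daily_sales d
```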
Comments
Hi,
This topic was brought up a while back and got some really thorough explanations from Domo reps. Check out this link:
They do a better job detailing it than I would.
Hope this helps,
ValiantSpur
**Please mark "Accept as Solution" if this post solves your problem.
**Say "Thanks" by clicking the "heart" in the post that helped you.