Use recursive dataflows to update datasets
We have 20 to 30 massive tables (40+ million rows each) that need to be updated, and I am wondering whether a recursive dataflow could be an alternative for updating all of these datasets. From my research, setting up a recursive dataflow appears to be a two-step process: first, run the dataflow with the input datasets to create the output dataset; then add that output dataset as an input and remove the original dataset. My question is whether this can then run automatically as a single dataflow, so that we do not need to update the dataset manually. As I understand it, this requires running the ETL process once to create the output dataset, and then again to append the updated data to the historical data. Is this correct?
Does this mean that we have to manually run two ETL processes to update the datasets, or can we make this process automatic? Thanks in advance!
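For context, the per-run logic of a recursive dataflow (append the fresh rows to the historical output, keeping the newest row per key) can be sketched in plain Python. This is only an illustration of the pattern, not Domo's implementation; the `id` key column and row schema are hypothetical.

```python
def recursive_update(historical, updated, key="id"):
    """Append updated rows to historical, keeping the newest row per key.

    Mirrors what a recursive dataflow does on each scheduled run: the
    previous output (historical) is fed back in as an input, new rows are
    appended, and duplicates on the key are dropped in favor of fresh data.
    NOTE: the "id" key column and the row shape are assumptions for this sketch.
    """
    merged = {row[key]: row for row in historical}     # start from history
    merged.update({row[key]: row for row in updated})  # fresh rows win on key
    return list(merged.values())

history = [{"id": 1, "val": "old"}, {"id": 2, "val": "keep"}]
fresh = [{"id": 1, "val": "new"}, {"id": 3, "val": "added"}]
print(recursive_update(history, fresh))
```

Once the output dataset is wired in as its own input, each scheduled run performs this append-and-dedupe step, so no manual rerun is needed after the initial two-step setup.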
Comments
Hi,
This was a topic that was brought up a while back and got some really thorough explanations from Domo reps. Check out this link:
They do a better job detailing it than I would.
Hope this helps,
ValiantSpur
**Please mark "Accept as Solution" if this post solves your problem
**Say "Thanks" by clicking the "heart" in the post that helped you.0