Appending new data in Domo
Best Answers
-
Many connectors have the option to either replace or append data each time the data updates. For the ones that don't, you can create a recursive dataflow to snapshot the data over time. Here is a great recording that walks through how to build a recursive dataflow:
If I solved your problem, please select "yes" above
-
Another great option is the dataset copy - you can create a copy with whatever interval you want and have it append.
-
A recursive dataflow will allow you to UPSERT your data (update a record if it already exists, or insert a new record if it doesn't). So if you ever need to re-run your dataset, it'll protect against duplicated data. The caveat to recursive dataflows is that, by nature, as the dataset grows the ETL will take longer to run, since there's more data to import. In your case, since it's a monthly snapshot, that likely won't be an issue.
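To make the UPSERT behavior concrete, here is a minimal Python sketch of the merge logic a recursive dataflow performs (this is illustrative only, not Domo's implementation; the `snapshot_month` key and row shapes are assumptions for the example):

```python
def upsert(history, new_rows, key="snapshot_month"):
    """Update rows whose key already exists; insert rows with a new key."""
    merged = {row[key]: row for row in history}  # index existing snapshots by key
    for row in new_rows:
        merged[row[key]] = row                   # same key: overwrite; new key: insert
    return list(merged.values())

# Re-running January does not duplicate it; February is appended.
history = [{"snapshot_month": "2024-01-31", "sales": 100}]
new_rows = [
    {"snapshot_month": "2024-01-31", "sales": 120},  # re-run: row is updated
    {"snapshot_month": "2024-02-29", "sales": 90},   # new month: row is inserted
]
result = upsert(history, new_rows)
```

This is why a re-run is safe: the key lookup replaces the old month's row instead of appending a second copy.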
Using dataset copy and setting it to append will be quicker to process but won't protect against duplicated data if it ever runs more than once in a month.
A simpler option is a newer Magic ETL feature that lets you specify the output method on an Output Dataset tile:
Just input your original dataset, set the output method to partition (this will update, insert, or delete records as needed), and make sure you set a partition key on the month and year. You can use a Formula tile to add a Month field with the LAST_DAY function:
LAST_DAY(`date`)
Just make sure you pull in the entire month; otherwise a partially loaded month will overwrite the complete month you already stored.
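For clarity on what `LAST_DAY(`date`)` produces: it maps every date in a month to that month's last calendar day, giving one stable partition key per month. A Python equivalent (a sketch, assuming dates arrive as `datetime.date` values):

```python
import calendar
from datetime import date

def last_day(d: date) -> date:
    """Return the last calendar day of d's month, like SQL's LAST_DAY."""
    # monthrange returns (weekday of day 1, number of days in the month)
    return d.replace(day=calendar.monthrange(d.year, d.month)[1])

# Every date in February 2024 collapses to the same partition key, 2024-02-29.
key = last_day(date(2024, 2, 10))
```

Because every row in a month shares the same key, reloading that month replaces exactly one partition and leaves all other months untouched.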
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**