Is it possible with Magic ETL to append one dataflow to another dataflow that has the same source, just with more up-to-date data?
Yes, you can append with Magic ETL. What I'm not sure of from your post is whether you want to append or update. If you are appending, it sounds like you are creating a stacked data source with date-stamped versions of the same datasource. If so, bring the data source in and add a date field to identify each load. For the output flow, select the Append function in the Edit Columns section. I have a simple version that looks like the following:
The first two tiles use the same data source, and I have added a date to each so that when they are appended I get the stacked dataset, including the copy with the updated data.
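To make the pattern concrete, here's a rough sketch in plain Python (not Domo's actual API, just an illustration of what the tiles do): tag each snapshot of the same source with a load date, then append them. Field names like `snapshot_date` are my own, not anything Domo-specific.

```python
# Illustrative sketch of the "date-stamp then Append" pattern described above.
# Rows are plain dicts; "snapshot_date" is a made-up column name.

def tag_snapshot(rows, snapshot_date):
    """Add a date-stamp column to every row of one snapshot of the source."""
    return [{**row, "snapshot_date": snapshot_date} for row in rows]

def append_snapshots(*snapshots):
    """Stack the tagged snapshots into one dataset (what the Append tile does)."""
    stacked = []
    for snap in snapshots:
        stacked.extend(snap)
    return stacked

# Two pulls of the same source, taken a week apart, with the same row updated.
week1 = tag_snapshot([{"id": 1, "amount": 100}], "2017-06-01")
week2 = tag_snapshot([{"id": 1, "amount": 120}], "2017-06-08")

stacked = append_snapshots(week1, week2)
# Both versions of row id=1 survive, distinguished by snapshot_date.
```

The point of the added date field is exactly this: without it, the two copies of the same row would be indistinguishable after the append.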
That's exactly what I was trying to do, thank you!
Hi,
I want to append a static dataset A and a live one B (which is updated every week). If I use this method, will I lose my first week's data once I reach the second week? Thank you.
Let's say we are in the 1st week of June 2017:
A contains data for Feb 2017 - May 2017
B contains data for the 1st week of Jun 2017
Append will result in Feb 2017 - 1st week of Jun 2017

Then in the 2nd week of June 2017:
A still contains data for Feb 2017 - May 2017
B contains data for the 2nd week of Jun 2017
Append --> will I lose my 1st week of Jun 2017?
Thank you
@user05527 wrote: Hi, I want to append a static dataset A and a live one B (that is updated every week). If I use this method, will I lose my first week's data once I reach the second week? Let's say we are in the 1st week of June 2017: A contains data for Feb 2017 - May 2017, B contains data for the 1st week of Jun 2017, and Append will result in Feb 2017 - 1st week of Jun 2017. In the 2nd week of June 2017: A still contains data for Feb 2017 - May 2017, and B contains data for the 2nd week of Jun 2017. Append --> will I lose my 1st week of Jun 2017? Thank you
Did you ever get a response or figure out a way to do this and keep all data intact? This is my exact scenario.
Thanks!
--Nick
It's possible with a Recursive DataFlow: https://knowledge.domo.com/Prepare/DataFlow_Tips_and_Tricks/Creating_a_Recursive%2F%2FSnapshot_ETL_DataFlow
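The idea behind a recursive (snapshot) dataflow is that the flow's previous output is fed back in as one of its inputs, so history accumulates even though the live source only ever holds the newest week. Here's a hedged sketch of that logic in plain Python (not Domo's API; the `week` key and function names are illustrative):

```python
# Illustrative sketch of the recursive-dataflow pattern: each run takes the
# previous output plus the new rows from the live source. Weeks that show up
# again in the new pull overwrite their old copies rather than duplicating.

def recursive_update(previous_output, new_rows, key="week"):
    """Keep all history from the last run, but let re-delivered periods
    replace their old copies instead of appending duplicates."""
    new_keys = {row[key] for row in new_rows}
    kept = [row for row in previous_output if row[key] not in new_keys]
    return kept + new_rows

# Run 1: no history yet; the live source holds only week 1.
history = recursive_update([], [{"week": "2017-W23", "sales": 10}])

# Run 2: the live source now holds only week 2, but week 1 survives
# because last run's output was fed back in as an input.
history = recursive_update(history, [{"week": "2017-W24", "sales": 12}])
```

This directly answers the earlier question: with a plain append you would lose week 1 once the live source moves on, but with the recursive pattern the old weeks persist in the output dataset from run to run.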
There's an append ETL option for Redshift that's in beta, per this Domopalooza 2020 webinar: https://console.on24.com/view/presentation/flash/ended.html?eventid=2218249&key=BC29488B0C99F2B34D25E651FCF857FE&text_language_id=en&powered-by-on24-visibility=Yes.
Thanks @jimmy. I also found a way to do this with the Domo Data Copy connector plus an ETL with filters. Once established, this method was more straightforward for the team managing the process going forward than the recursive MySQL flow.
@PacoTaco are there any docs explaining how to do that with the Data Copy connector and an ETL with filters?
@jimmy Dataset Copy is just a type of Domo connector. It will copy a dataset. Set it to Append instead of Replace.
Use Filters to limit the source dataset to just the subset of data you want to Append.
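To illustrate the combination of the two steps above in plain Python (not Domo's API; the `date` column and the cutoff are made-up examples): filter the copied dataset down to just the new subset, then append that subset to the existing output instead of replacing it.

```python
# Illustrative sketch of "Dataset Copy set to Append + a filter in the ETL":
# only rows past a cutoff date are kept from the copy, then appended.
# ISO-formatted date strings compare correctly with plain string comparison.

def filter_then_append(target, source, since):
    """Keep only the new subset of the copied dataset, then append it."""
    subset = [row for row in source if row["date"] >= since]
    return target + subset

existing = [{"date": "2017-05-01", "sales": 5}]            # what's already stored
copied = [{"date": "2017-05-01", "sales": 5},              # full copy of the source
          {"date": "2017-06-08", "sales": 7}]

result = filter_then_append(existing, copied, "2017-06-01")
# The May row is not duplicated; only the new June row is appended.
```

The filter is what keeps the append from duplicating rows you already loaded, which is why the two pieces have to be used together.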