Is it possible with Magic ETL to append a dataflow

Is it possible with Magic ETL to append a dataflow to another dataflow that has the same source, just with more recent data?

Best Answer

  • Shevy
    Shevy Contributor
    Answer ✓

Yes, you can append with Magic ETL. What I'm not sure of from your post is whether you want to append or update. If you are appending, it sounds like you are creating a stacked DataSet with date-stamped versions of the same data source. If so, bring the data source in and add a date field to identify it. For the outflow, select the Append function in the Edit columns section. I have a simple version that looks like the following: [MagicETL.PNG]

The first two use the same data source, and I have added a date to each so that, when appended, I get the stacked dataset with the updated data.

    Dojo Community Member
    ** Please like responses by clicking on the thumbs up
    ** Please Accept / check the answer that solved your problem / answered your question.
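In code terms, the Append step described above behaves like stacking the rows of two inputs after each is tagged with its own date field. A minimal Python sketch of that idea (the datasets and the `snapshot_date` field are hypothetical, purely for illustration, not Domo syntax):

```python
# Sketch of what the Magic ETL Append step does conceptually:
# tag each snapshot of the same source with a date, then stack the rows.
# All data and field names here are made up for illustration.

june_run = [{"region": "East", "sales": 100}, {"region": "West", "sales": 80}]
july_run = [{"region": "East", "sales": 120}, {"region": "West", "sales": 95}]

def add_date_field(rows, snapshot_date):
    """Mimic the 'add a date field to identify it' step."""
    return [{**row, "snapshot_date": snapshot_date} for row in rows]

# Append = concatenate the date-stamped versions into one stacked dataset.
stacked = add_date_field(june_run, "2017-06-01") + add_date_field(july_run, "2017-07-01")

for row in stacked:
    print(row)
```

The date field is what lets you tell the snapshots apart after they are stacked, e.g. to filter a card to the latest snapshot only.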

Answers

  • That's exactly what I was trying to do, thank you!

  • Hi,

    I want to append a static dataset A and a live one B (which is updated every week). If I use this method, will I lose my first week's data once I reach the second week? Thank you.

    Let's say we are in the 1st week of June 2017:

    A contains data Feb 2017 - May 2017

    B contains data 1st week Jun 2017

    Append will result in Feb 2017 - 1st week Jun 2017

    2nd week of June 2017:

    A still contains data Feb 2017 - May 2017

    B contains data 2nd week Jun 2017

    Append --> will I lose my 1st week of Jun 2017 data?

    Thank you


  • @user05527 wrote:

        I want to append a static dataset A and a live one B (which is updated every week). If I use this method, will I lose my first week's data once I reach the second week?

    Did you ever get a response or figure out a way to do this and keep all the data intact? This is my exact scenario.

    Thanks!

    --Nick

  • Thanks @jimmy. I also found a way to do this with the Domo Data Copy connector and an ETL with filters. Once established, this method was more straightforward for the team that was going to manage the process going forward than the recursive MySQL flow.
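For reference, the "recursive" approach mentioned above works by feeding the dataflow's own output back in as an input, so each run keeps the prior history and appends only the new rows. A rough Python sketch of that idea (the `week` key, function name, and data are illustrative assumptions, not Domo or MySQL syntax):

```python
# Sketch of a recursive dataflow: the previous output is an input to the next run.
# "week" acts as the key that decides which incoming rows are new.
# All names and data here are hypothetical.

history = []  # plays the role of the dataflow's previous output

def recursive_run(history, live_input, key="week"):
    """Keep all historical rows, then append incoming rows for weeks not seen yet."""
    seen = {row[key] for row in history}
    new_rows = [row for row in live_input if row[key] not in seen]
    return history + new_rows

# Week 1: the live dataset B holds only week 1 data.
history = recursive_run(history, [{"week": "2017-W22", "sales": 50}])
# Week 2: B has been replaced with week 2 data, but history keeps week 1.
history = recursive_run(history, [{"week": "2017-W23", "sales": 60}])

print(history)  # both weeks survive the second run
```

This is why the recursive pattern answers the earlier question about dataset B: even though B is overwritten each week, the accumulated output is preserved because it is re-read on every run.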

  • jimmy
    jimmy Member

    @PacoTaco are there any docs explaining how to do that with the Data Copy connector and an ETL with filters? 

  • @jimmy Dataset Copy is just a type of Domo connector. It copies a dataset. Set it to Append instead of Replace.

    Use Filters to limit the source dataset to just the subset of data you want to append.

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

    **Say "Thanks" by clicking the ❤️ in the post that helped you.
    **Please mark the post that solves your problem by clicking on "Accept as Solution"
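The Append update mode described above means each connector run adds its current rows onto the stored DataSet instead of overwriting it. A small sketch of the difference (pure illustration with hypothetical data, not connector code):

```python
# Illustration of Replace vs Append update modes for a connector run.
# stored_dataset stands in for the Domo DataSet; all data is hypothetical.

def run_connector(stored_dataset, incoming_rows, mode):
    if mode == "Replace":
        return list(incoming_rows)                    # wipe and reload
    if mode == "Append":
        return stored_dataset + list(incoming_rows)   # keep history, add new rows
    raise ValueError(f"unknown mode: {mode}")

week1 = [{"week": 1, "sales": 50}]
week2 = [{"week": 2, "sales": 60}]

replaced = run_connector(run_connector([], week1, "Replace"), week2, "Replace")
appended = run_connector(run_connector([], week1, "Append"), week2, "Append")

print(len(replaced))  # 1 -> week 1 is gone
print(len(appended))  # 2 -> both weeks kept
```

This is the behavior that lets the Data Copy connector accumulate weekly snapshots while the source dataset itself only ever holds the latest week.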