Time taken for ETL Flow

Hi 

 

I have a question regarding the time taken to run an ETL flow. I have an AWS object with about 90 million rows, which grows by about 10k to 20k rows per day, and we have a scheduled update designed to run daily.

 

The thing is, we notice that even though only about 10k rows are added or updated on an average day, the time taken for the object to run, and for the subsequent ETLs to run, remains the same (if not increasing by a margin of about 30 minutes). We would like to know why this is happening: is it the way the Domo tool is designed, or is there a workaround?

 

I have attached a screenshot of the history log, where I have highlighted the number of rows updated and the time taken. Even though the difference in rows updated between two days is only about 10k on average, the time for the dataflow/object to run stays the same.

 

It would be helpful if you could shed some light on this.

Comments

  • cwolman (Contributor)

    The majority of the time is spent loading the rows into the ETL, writing them back into Domo when complete, and indexing the output. This has to happen against the full dataset on every run, regardless of how few rows changed, which is why the run time tracks total size rather than the daily delta; a back-of-envelope comparison is below. I do not know what transforms you need to perform on the data or what your data source is, but depending on each you may have some options.
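    To see the scale of the difference, here is a quick back-of-envelope comparison using the row counts from the question; the numbers illustrate the full-replace cost only, they are not Domo internals.

    ```python
    TOTAL_ROWS = 90_000_000   # full dataset, reloaded and re-indexed every run
    DAILY_DELTA = 20_000      # rows actually added/updated per day

    # A full replace moves the whole dataset each run; an incremental
    # load (append/upsert) would move only the delta.
    ratio = TOTAL_ROWS / DAILY_DELTA
    print(f"Replace moves {ratio:,.0f}x more rows per run than the delta")
    # -> Replace moves 4,500x more rows per run than the delta
    ```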

     

    If you are using Workbench, then maybe you could do the transforms before importing the data and perform an append instead of a replace, as sketched below.
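    To make the append idea concrete, here is a minimal sketch of the incremental-extract pattern, assuming the source table has an `updated_at` column. The table name, state file, and the `append_to_domo` uploader are hypothetical stand-ins; in Workbench you would get the same effect with a filtered query and the job's update method set to Append, not with this code.

    ```python
    import sqlite3                      # stand-in for your real source driver
    from datetime import datetime, timezone

    STATE_FILE = "last_run.txt"         # hypothetical high-water-mark store

    def read_last_run() -> str:
        try:
            with open(STATE_FILE) as f:
                return f.read().strip()
        except FileNotFoundError:
            return "1970-01-01T00:00:00+00:00"   # first run: take everything

    def extract_delta(conn, since):
        # Pull only the rows touched since the last successful run,
        # instead of re-reading all ~90M rows.
        cur = conn.execute("SELECT * FROM orders WHERE updated_at > ?", (since,))
        return cur.fetchall()

    conn = sqlite3.connect("source.db")  # hypothetical source database
    since = read_last_run()
    delta = extract_delta(conn, since)
    # append_to_domo(delta)  # hypothetical uploader; append, don't replace
    with open(STATE_FILE, "w") as f:
        f.write(datetime.now(timezone.utc).isoformat())
    print(f"Appended {len(delta)} rows changed since {since}")
    ```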

     

    Your other option, if the scenario fits, would be an upsert (see the sketch below). You will need to contact your CSM about having this functionality enabled for your instance.
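    For clarity on what an upsert buys you: only the changed rows cross the wire, and they are reconciled against the existing dataset by a key column. Here is a minimal sketch of the update-or-insert semantics; the key name and data are illustrative, not Domo's API.

    ```python
    def upsert(existing: dict, incoming: list, key: str = "id") -> dict:
        """Merge incoming rows into existing rows by primary key:
        matching keys are updated, new keys are inserted."""
        for row in incoming:
            existing[row[key]] = row   # update-or-insert, hence "upsert"
        return existing

    # Two existing rows; the delta updates one and inserts one new row.
    current = {1: {"id": 1, "qty": 5}, 2: {"id": 2, "qty": 7}}
    delta = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
    upsert(current, delta)
    print(current)
    # {1: {'id': 1, 'qty': 5}, 2: {'id': 2, 'qty': 9}, 3: {'id': 3, 'qty': 1}}
    ```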

     

    Chris

