When a Dataflow executes with a Snowflake dataset, it uses the previously staged copy of that dataset and only pulls in an update if the results of the previous data freshness check indicate that it should.
Consequently, the dataflow will miss any updates that have happened since the last Data Freshness check.
Considering the Snowflake Warehouse is already active during a dataflow, would it make sense to check the freshness of that dataset at that point? Or could an option be added to the MagicETL data source to enforce a freshness check during execution?
This would allow a different behaviour for priority datasets.
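
To make the suggestion concrete, here is a minimal sketch of the behaviour described above. This is purely illustrative and not Domo's actual API: every name (`run_dataflow`, `check_freshness_now`, `copy_from_snowflake`, `force_freshness_check`) is hypothetical. It just contrasts the current logic (reuse the staged copy unless the previous scheduled check flagged an update) with the proposed per-dataflow option (re-check freshness at execution time, while the Snowflake warehouse is already active).

```python
# Illustrative sketch only -- none of these names are Domo's real API.
from dataclasses import dataclass


@dataclass
class SnowflakeDataset:
    staged_copy: str                 # data staged at the last import
    last_check_flagged_update: bool  # result of the last scheduled freshness check


def check_freshness_now(dataset: SnowflakeDataset) -> bool:
    """Hypothetical: ask Snowflake (already active during the dataflow) if new data exists."""
    return True  # stub


def copy_from_snowflake(dataset: SnowflakeDataset) -> str:
    """Hypothetical: re-stage the dataset from Snowflake."""
    return "freshly staged data"  # stub


def run_dataflow(dataset: SnowflakeDataset, force_freshness_check: bool = False) -> str:
    if force_freshness_check:
        # Proposed option: check freshness during execution, catching updates
        # made since the last scheduled check (useful for priority datasets).
        stale = check_freshness_now(dataset)
    else:
        # Current behaviour: rely on the previous check, so any update made
        # since that check is missed by this run.
        stale = dataset.last_check_flagged_update

    if stale:
        dataset.staged_copy = copy_from_snowflake(dataset)

    # The dataflow transforms then run against whatever is staged.
    return dataset.staged_copy
```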