Committing data before update in dataflow

Hi, I would like to know if there is a way to commit data before an update runs in a dataflow.

We currently use Kafka to push records from a file to the dataset. With Kafka we can configure how many records we expect, so before pushing to the dataset we can confirm the number of records in the file and then commit.
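
Roughly, the check we do today looks like this (just a sketch; push_to_dataset() is a placeholder for our actual uploader, and EXPECTED_RECORDS is the count we configure up front):

```python
EXPECTED_RECORDS = 100  # the record count we configure before the load

def push_to_dataset(records):
    """Placeholder for our actual dataset uploader."""
    ...

def push_file_if_complete(path):
    # One record per line; only push once the full batch is present.
    with open(path) as f:
        records = f.readlines()
    if len(records) == EXPECTED_RECORDS:
        push_to_dataset(records)
        return True
    return False
```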

Now we would like to get rid of the file and instead push the data to our dataset/dataflow as a continuous stream.

Let's say our streamed data consists of 100 records.

If only 50 records get consumed and the other 50 are delayed, how can we avoid pushing the data to the dataset until all 100 records are ready?
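
What we have in mind is something like the following sketch, where the consumer buffers records and only commits offsets once the complete batch has been pushed (assuming the kafka-python package; the topic name, broker address, group id, and push_to_dataset() are placeholders, as above):

```python
from kafka import KafkaConsumer  # kafka-python package

EXPECTED_RECORDS = 100

def push_to_dataset(records):
    """Placeholder for our actual dataset uploader."""
    ...

# Auto-commit is disabled so offsets are committed only after
# the complete batch has been pushed to the dataset.
consumer = KafkaConsumer(
    "records-topic",                     # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker address
    group_id="dataset-loader",           # assumed consumer group
    enable_auto_commit=False,
)

buffer = []
for message in consumer:
    buffer.append(message.value)
    if len(buffer) >= EXPECTED_RECORDS:
        push_to_dataset(buffer)  # push the complete batch first...
        consumer.commit()        # ...then commit, so a crash replays the whole batch
        buffer.clear()
```

With this pattern, if only 50 records have arrived, nothing is pushed or committed; a restart would simply re-consume from the last committed offset. Is there a supported way to achieve this kind of batching against the dataset/dataflow directly?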
