Is there a way to limit the number of input records in an ETL dataflow?
I have 10 GB of data and want to run a Python script on it, but I am hitting an out-of-memory error because of the data size. I would like to limit the number of records taken from the input dataset before the script runs. How can I do that? Thanks in advance for any suggestions and solutions.
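For context, the workaround I am considering looks something like the sketch below. It assumes the input is CSV-like and uses pandas' `nrows` parameter to cap how many records are read; `LIMIT` and the inline sample data are placeholders, since the actual ETL platform and file are not shown here:

```python
import io
import pandas as pd

# Stand-in for the real input dataset (hypothetical sample data).
data = io.StringIO("id,value\n" + "\n".join(f"{i},{i * 2}" for i in range(1000)))

LIMIT = 100  # assumed record cap; tune to whatever fits in memory

# `nrows` stops pandas after LIMIT records, so the full dataset
# never has to be loaded into memory at once.
df = pd.read_csv(data, nrows=LIMIT)
print(len(df))  # → 100
```

Is there a built-in way to do this at the dataflow level instead of inside the script?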