Comments
-
Thanks JaeW
-
Thanks for the help. Fixed
-
Thanks Mark. I am running this new structure now as a test.
-
Thanks Grant. What would you say is the best method? I've already set the batch size to 5000 and the row limit to 9000000. Is it better to apply a JSON query filter and run multiple data imports, or is there a function in the importer that checks for documents not yet imported?