Importing a Large Data Set from MongoDB


Hi, I'm running into a lot of errors importing data from my MongoDB (using the MongoDB V2 connector). The data set contains around 10 million JSON documents. The connector itself works: I have imported successfully before, and the importer runs fine when the row count is limited to 100,000.

What is the best path to take to get the whole data set imported?



  • GrantSmith

    Hi @bdx

    You'll want to try and import your data in batches if possible.

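Batching by a monotonically increasing key such as MongoDB's `_id` is a common way to do this, since each run can pick up where the last one stopped. A minimal sketch of the pagination logic, shown against an in-memory stand-in for the collection (with pymongo the equivalent query would be a `find` filtered on `{"_id": {"$gt": last_id}}` with a sort and limit; the field values here are hypothetical):

```python
# Sketch: page through a large collection in batches keyed on "_id",
# checkpointing the last _id seen so each batch resumes after the previous one.

BATCH_SIZE = 3  # in practice something like 5000


def fetch_batch(collection, last_id, batch_size):
    """Return the next batch of documents with _id greater than last_id,
    ordered by _id (mirrors a find/sort/limit query)."""
    matching = sorted(
        (doc for doc in collection if doc["_id"] > last_id),
        key=lambda d: d["_id"],
    )
    return matching[:batch_size]


def import_all(collection, batch_size=BATCH_SIZE):
    """Drain the collection batch by batch, tracking the last _id seen."""
    imported = []
    last_id = ""  # sorts before every real id
    while True:
        batch = fetch_batch(collection, last_id, batch_size)
        if not batch:
            break
        imported.extend(batch)
        last_id = batch[-1]["_id"]  # checkpoint for the next batch
    return imported


# Hypothetical documents with sortable string _ids (real ObjectIds
# also sort roughly by creation time).
docs = [{"_id": f"id-{i:04d}", "value": i} for i in range(10)]
result = import_all(docs)
```

The same checkpoint idea works whether the batches run inside one job or as separate scheduled imports.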
  • bdx

    Thanks Grant.

    What would you say is the best method?

I already set the batch size to 5,000 and the row limit to 9,000,000.

Is it a good option to apply a JSON query filter and run multiple data imports, or is there a function in the importer that checks for documents that haven't been imported yet?
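For the multiple-imports approach, a MongoDB-style query filter on a range field can carve the collection into slices. A hypothetical example in MongoDB Extended JSON, assuming the documents have a `createdAt` date field (the exact filter syntax the V2 connector accepts may differ):

```json
{
  "createdAt": {
    "$gte": { "$date": "2021-01-01T00:00:00Z" },
    "$lt":  { "$date": "2021-04-01T00:00:00Z" }
  }
}
```

Running one import per non-overlapping date range and appending the results covers the whole collection without any single run hitting the limit.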

  • jaeW_at_Onyx

@bdx it's interesting that Domo is choking on 100k rows. I wonder if that's a MongoDB limitation. Domo can ingest multi-million-row datasets at a time, so it's not likely to be a Domo infrastructure problem.

that said, yes, use filters to limit the import to just the new documents you're bringing in. You may need to implement a recursive dataflow in Domo to handle the rest of the data.

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

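A recursive dataflow essentially appends each new batch to the historical output while deduplicating on a key. A minimal sketch of that merge logic in Python (field names are hypothetical; in Domo this would be a dataflow that takes its own previous output plus the latest import as inputs):

```python
def recursive_merge(historical, new_batch, key="_id"):
    """Append new_batch to historical, keeping the newest copy of any
    document present in both (mirrors a recursive dataflow that unions
    the previous output with the latest import and dedupes on a key)."""
    merged = {doc[key]: doc for doc in historical}      # existing rows
    merged.update({doc[key]: doc for doc in new_batch}) # new rows win
    return list(merged.values())


# Hypothetical run: two documents already imported; a new batch arrives
# with one overlapping (updated) document and one brand-new one.
historical = [{"_id": "a", "v": 1}, {"_id": "b", "v": 1}]
new_batch = [{"_id": "b", "v": 2}, {"_id": "c", "v": 1}]
result = recursive_merge(historical, new_batch)
```

Each scheduled run then only has to import the new slice, and the merge keeps the full history intact.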
  • bdx

    Thanks JaeW