Importing Large Data Set from Mongo DB
Hi, I am getting a lot of errors importing data from my MongoDB instance (using the MongoDB V2 connector). The dataset contains around 10 million JSON documents. The connector itself works: I have imported successfully before, and the import runs fine when the row count is limited to 100,000.
What is the best path to take to get the whole data set imported?
You'll want to try to import your data in batches if possible.
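One way to batch a collection that large is keyset pagination: fetch documents in `_id` order and resume each batch after the last `_id` seen, instead of using skip/limit offsets that get slower the deeper you go. This is a minimal sketch, not the connector's internals; the in-memory `fake_fetch` stands in for a live collection, and the pymongo call shown in the comment is how the same fetch would look against a real server.

```python
def paginate(fetch_batch, batch_size):
    """Keyset pagination: yield documents in _id order, resuming each
    batch just past the last _id already seen.  Unlike skip/limit,
    this stays fast even 9M rows deep into the collection."""
    last_id = None
    while True:
        batch = fetch_batch(last_id, batch_size)
        if not batch:
            return
        yield batch
        last_id = batch[-1]["_id"]

# In-memory stand-in for a Mongo collection so the loop runs without a
# server.  With pymongo, fetch_batch would look something like:
#   lambda last, n: list(coll.find({"_id": {"$gt": last}} if last else {})
#                        .sort("_id", 1).limit(n))
docs = [{"_id": i} for i in range(10)]

def fake_fetch(last_id, n):
    rows = [d for d in docs if last_id is None or d["_id"] > last_id]
    return rows[:n]

# With batch_size=4 and 10 docs this yields three batches of 4, 4, 2.
batches = list(paginate(fake_fetch, 4))
```

Each batch can then be handed to a separate connector run (or a smaller, bounded query filter), so a single failure only costs one batch instead of the whole 10M-row pull.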
What would you say is the best method?
I already set the batch size to 5,000 with the row limit at 9,000,000.
Is it a good option to apply a JSON query filter and run multiple data imports, or is there a function in the importer that checks for documents not yet imported?
@bdx It's interesting that Domo is choking above 100k rows. I wonder if that's a MongoDB limitation; Domo can ingest multi-million-row datasets at a time, so it's unlikely to be a Domo infrastructure problem.
That said, yes, use filters to limit the import to just the new documents you're bringing in. You may need to implement a recursive dataflow in Domo to handle the rest of the data.

Jae Wilson
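To filter down to only new documents, one common pattern is to checkpoint the newest value already imported and build the connector's JSON query filter from it. The sketch below assumes a hypothetical `createdAt` date field on each document, and uses MongoDB extended-JSON `$date` syntax for the literal; check the connector's documentation for the exact filter flavor it accepts.

```python
import json
from datetime import datetime, timezone
from typing import Optional


def incremental_filter(last_seen: Optional[datetime]) -> str:
    """Build a JSON query filter string that matches only documents
    newer than the checkpoint.  Returns "{}" (match everything) when
    no checkpoint exists yet, i.e. for the very first import.

    Assumes a 'createdAt' field on every document -- a hypothetical
    name; substitute whatever timestamp your documents carry."""
    if last_seen is None:
        return "{}"
    iso = last_seen.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.000Z")
    # Extended-JSON date literal; some connectors expect a plain ISO
    # string instead, so verify against the connector docs.
    return json.dumps({"createdAt": {"$gt": {"$date": iso}}})
```

Run the import once with the empty filter, record the maximum `createdAt` that landed in Domo, and feed the generated filter into each subsequent run so only the delta is pulled; the recursive dataflow then appends the delta to the historical dataset.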