Export a very large dataset (millions of rows) to CSV in chunks

Is it possible in Domo, using Python or Postman, to export a whole dataset but break it into multiple files (CSV/Excel) so that each exported file contains 5 million rows?

Answers

  • Hi @user094816

    I don't believe Domo's API supports pagination / chunking of exported data. You could use the API to read the entire dataset and then handle the splitting logic yourself in your Python script, as sketched below.
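
    Here is a minimal Python sketch of that approach, assuming the standard Domo Developer (OAuth) DataSet API export endpoint (`GET /v1/datasets/{id}/data`); the client ID, client secret, and dataset ID are hypothetical placeholders you would supply. It streams the CSV export and starts a new file every 5 million rows so the full export never has to fit in memory:

    ```python
    import requests

    # Assumptions: standard Domo Developer (OAuth) DataSet API;
    # all IDs/credentials below are hypothetical placeholders.
    CLIENT_ID = "your-client-id"
    CLIENT_SECRET = "your-client-secret"
    DATASET_ID = "your-dataset-id"
    ROWS_PER_FILE = 5_000_000

    # 1) Obtain an access token via the client-credentials grant
    token_resp = requests.post(
        "https://api.domo.com/oauth/token",
        params={"grant_type": "client_credentials", "scope": "data"},
        auth=(CLIENT_ID, CLIENT_SECRET),
    )
    token_resp.raise_for_status()
    access_token = token_resp.json()["access_token"]

    # 2) Stream the full CSV export so it never has to fit in memory
    export = requests.get(
        f"https://api.domo.com/v1/datasets/{DATASET_ID}/data",
        params={"includeHeader": "true"},
        headers={"Authorization": f"Bearer {access_token}", "Accept": "text/csv"},
        stream=True,
    )
    export.raise_for_status()
    export.encoding = export.encoding or "utf-8"  # ensure text decoding for iter_lines

    # 3) Split the stream into CSV files of ROWS_PER_FILE rows each,
    #    repeating the header row at the top of every part
    lines = export.iter_lines(decode_unicode=True)
    header = next(lines)
    part, rows_written, out = 0, 0, None

    for line in lines:
        if out is None or rows_written >= ROWS_PER_FILE:
            if out:
                out.close()
            part += 1
            rows_written = 0
            out = open(f"export_part_{part:03d}.csv", "w", encoding="utf-8")
            out.write(header + "\n")
        out.write(line + "\n")
        rows_written += 1

    if out:
        out.close()
    ```

    One caveat: this splits on raw newlines, which assumes no embedded line breaks inside quoted CSV fields; if your data contains them, feed the stream through Python's csv module instead.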

    **Was this post helpful? Click Agree or Like below**
    **Did this solve your problem? Accept it as a solution!**
  • Hi @GrantSmith, thank you for your feedback. Exporting data from cards in Domo is slow (even after applying a date-range filter to keep the row count below 10 M) for two reasons:

    1) The data volume is very large

    2) Many filters are applied to the card

    I have resolved the second issue by adding all of the filters in the ETL itself to reduce the data and generate a new dataset.

    Now I am looking to export this huge dataset: 104 M rows, with a data size of 38.5 GB. What is the best approach?

    I tried to filter and export the dataset from the Domo UI, but it does not export only the filtered rows.

  • Either use Dataset Views to construct filtered views of your dataset, or use the Domo CLI (https://knowledge.domo.com/Administer/Other_Administrative_Tools/Command_Line_Interface_(CLI)_Tool#section_34) to query-data or export-data and create filtered exports.
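
    For those who prefer Python over the CLI, here is a hedged sketch of an equivalent filtered export, assuming the Domo DataSet query API (`POST /v1/datasets/query/execute/{id}`); the credentials, dataset ID, and WHERE clause are hypothetical placeholders. Because this endpoint returns the whole result as JSON in memory, it suits the filtered (reduced) exports described above rather than the full 104 M rows:

    ```python
    import csv
    import requests

    # Assumptions: Domo Developer OAuth credentials and the DataSet query API;
    # IDs, credentials, and the WHERE clause are hypothetical placeholders.
    CLIENT_ID = "your-client-id"
    CLIENT_SECRET = "your-client-secret"
    DATASET_ID = "your-dataset-id"

    # Obtain an access token via the client-credentials grant
    token = requests.post(
        "https://api.domo.com/oauth/token",
        params={"grant_type": "client_credentials", "scope": "data"},
        auth=(CLIENT_ID, CLIENT_SECRET),
    ).json()["access_token"]

    # Run a filtered query; the query API addresses the dataset simply as "table"
    resp = requests.post(
        f"https://api.domo.com/v1/datasets/query/execute/{DATASET_ID}",
        headers={"Authorization": f"Bearer {token}"},
        json={"sql": "SELECT * FROM table WHERE `Date` >= '2023-01-01'"},  # hypothetical filter
    )
    resp.raise_for_status()
    result = resp.json()  # expected to contain "columns" and "rows"

    # Write the filtered result to a CSV file
    with open("filtered_export.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(result["columns"])
        writer.writerows(result["rows"])
    ```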

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

    **Say "Thanks" by clicking the ❤️ in the post that helped you.**
    **Please mark the post that solves your problem by clicking on "Accept as Solution"**