Is there a row-count restriction in the Domo API query functionality?

I'm using the Domo API query functionality (https://developer.domo.com/docs/dataset-api-reference/dataset#Query%20a%20DataSet), but it seems to have a 1-million-row cap. Is this something that can be turned off, or do I have to look for a workaround?

Comments

  • Hi,

     

    I have read something related to row limits; please check whether these links help you.

     

    https://dojo.domo.com/t5/Domo-Developer/Configure-tab-is-missing-in-preview-dataflow/m-p/45246#M1382

    https://dojo.domo.com/t5/Domo-Developer/How-we-can-limit-the-number-of-rows-in-api-call/m-p/39392#M1094

     

    Thanks,

    Neeti

  • WHM
    WHM Contributor

    I use the query API in a number of situations, but generally it is to check something like the volume of sales in a dataset for yesterday, so that I know it is a reasonable number and can send out the reports based on that dataset. I am curious why you are trying to query more than a million rows using the API. If it is absolutely necessary, you might consider a series of calls rather than pulling back all the data at the same time: loop through a series of values that you swap out in your WHERE clause, e.g. year or region (see the sketch at the end of this post).

     

    I do a lot of this scripting in PowerShell (it's easy and free) because we regularly get SSL/TLS communication errors when calling WB.exe. If WB.exe fails to connect to the mothership, it stops without throwing an error, and you have no idea it did not run. I am always happy to share code that I use. I have scripts to run WB jobs, send reports, copy PDP rules, and "promote" users to social if they have not logged in for 90 days. I recently changed that script to put terminated people into a custom Social group so that I do not lose their history and can tell whether to expect them back in the instance.

    I hope this helps. Shout anytime if I may be of assistance.

     

    William
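
    P.S. A minimal sketch of the looped-WHERE idea, shown in Python for brevity (my own scripts are PowerShell). It assumes the query endpoint from the docs linked in the question and an access token obtained beforehand; the dataset id, column name, and year range are all illustrative.

    import requests

    API = "https://api.domo.com"
    TOKEN = "your_access_token"     # assumed: obtained via Domo's OAuth flow beforehand
    DATASET_ID = "your-dataset-id"  # illustrative placeholder

    HEADERS = {
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    }

    all_rows = []
    for year in range(2018, 2023):  # swap one WHERE-clause value per call
        sql = f"SELECT * FROM table WHERE year = {year}"
        resp = requests.post(
            f"{API}/v1/datasets/query/execute/{DATASET_ID}",
            headers=HEADERS,
            json={"sql": sql},
        )
        resp.raise_for_status()
        all_rows.extend(resp.json()["rows"])  # each slice stays under the cap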

     

  • jimmy
    jimmy Member

    I use LIMIT and OFFSET to download tables with more than 1M rows, like this (a looped version is sketched below):

    SELECT * FROM table LIMIT 1000000 OFFSET 0;
    SELECT * FROM table LIMIT 1000000 OFFSET 1000000;

    Or you can use the Domo Java CLI tool's `export-data` command: https://knowledge.domo.com/Administer/Other_Administrative_Tools/Command_Line_Interface_(CLI)_Tool
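
    For the LIMIT/OFFSET route, a minimal Python loop might look like the sketch below. It assumes the query endpoint from the docs linked in the question and an already-obtained access token; the dataset id and page size are illustrative.

    import requests

    API = "https://api.domo.com"
    TOKEN = "your_access_token"     # assumed OAuth access token
    DATASET_ID = "your-dataset-id"  # illustrative placeholder
    PAGE = 1_000_000                # stay at or below the observed cap

    HEADERS = {
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    }

    rows, offset = [], 0
    while True:
        sql = f"SELECT * FROM table LIMIT {PAGE} OFFSET {offset}"
        resp = requests.post(
            f"{API}/v1/datasets/query/execute/{DATASET_ID}",
            headers=HEADERS,
            json={"sql": sql},
        )
        resp.raise_for_status()
        page = resp.json()["rows"]
        rows.extend(page)
        if len(page) < PAGE:  # a short page means we've reached the end
            break
        offset += PAGE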

  • I do the same with LIMIT and OFFSET, but after fetching ~80,000 records Domo starts to return a `400: 'There was a problem executing the SQL query: Underlying service error: Internal Server Error'` response. Not sure how to handle that; this looks like a problem on the API side.
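
    One thing worth trying, though I can't confirm it fixes a server-side error: retry each page with a smaller LIMIT and a short backoff. A hypothetical sketch:

    import time
    import requests

    def fetch_page(session, url, offset, limit, max_tries=3):
        # session: a requests.Session with the Authorization header already set.
        # On an error response, halve the LIMIT and back off before retrying; the
        # caller should then advance its offset by len(rows), since the effective
        # page size can shrink.
        for attempt in range(max_tries):
            sql = f"SELECT * FROM table LIMIT {limit} OFFSET {offset}"
            resp = session.post(url, json={"sql": sql})
            if resp.ok:
                return resp.json()["rows"]
            limit = max(limit // 2, 1)
            time.sleep(2 ** attempt)
        resp.raise_for_status()  # give up and surface the last error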

  • Check out the Java CLI (linked above) and its `export-data` command. It grabs the entire dataset as a .csv file.

    To use the Java CLI, after adding Java to your path, create a script file called `script.txt` like this:

     

    connect --token {your_token} --server {your_server}
    export-data --id "{ds_id}" --filename "{filename.csv}"
    quit

     

    From the command line run: 

     

    java -jar {jar_path} --script script.txt

     

  • Agree with @WHM's comment... there really isn't a strong reason to export a million rows of data at a time.

     

    If you're trying to export your data mart for the sake of backing it up, there are tools in the Java CLI explicitly designed for that task (save to S3, for example).

     

    Also, there are tools like writeback connectors for sending data back to your data lake.

    Lastly, you can use the ODBC connector.

     

    If you're exporting data so you can 'play with it in Excel', consider using a Dataset View to subset or aggregate just the data you need to export, creating a more manageable process.

     

    If you're exporting data so you can play with it in Python or R, consider looking at the Jupyter notebook integration.

     

    I just can't fathom a good reason to export a million rows of data at a time via API from any BI product unless it's part of a pipeline... and again, there are tools designed for that job.

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

  • @jaeW_at_Onyx, here are a few reasons why I export 1M+ rows:

    • We don't have Jupyter notebook integration
    • No `anti_join()` in Magic ETL or Magic ETL v2 (see the pandas sketch below for the operation I mean)
    • I can do some exploratory data analysis locally in R faster than I can in Domo
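
    For anyone following along: an anti-join keeps the rows of one table that have no match in another, and once the data is exported it is a one-liner locally. A pandas sketch with made-up frames (dplyr's `anti_join()` is the R equivalent):

    import pandas as pd

    orders = pd.DataFrame({"customer_id": [1, 2, 3], "amount": [10, 20, 30]})
    refunds = pd.DataFrame({"customer_id": [2]})

    # anti-join: keep orders whose customer_id has no match in refunds
    no_refund = orders[~orders["customer_id"].isin(refunds["customer_id"])]
    print(no_refund)  # rows for customers 1 and 3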