Trying to append data from Databricks
Hello,
My requirement is to pull data from a Databricks table — run a query returning 90 columns over 12 months, which is around 2M records — and append it to an existing dataset via the API.
Since it's a large volume of data, I'm getting the error '"status":400,"statusReason":"Bad Request","message":"Too many non-consecutive part ids. Make consecutive or upload multiple data versions'.
I then tried to run the query month by month and append the results to the dataset using
`domo.ds_create('datasetname')`
`domo.get('datasetid')`
Now I get the error 'Python int too large to convert to C long'.
Can someone help me here, or provide the proper syntax to append data daily to the existing dataset using a Python script that I can run from Databricks?
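For reference, the month-by-month append being attempted can be sketched like this. This is a hedged sketch, not a tested solution: it assumes pydomo is installed, that `domo.datasets.data_import` accepts an `update_method='APPEND'` argument in your pydomo version (check yours), and `DATASET_ID`, `CLIENT_ID`, and `CLIENT_SECRET` are placeholders. Note that the high-level `domo.ds_update()` replaces the whole dataset, which is why the low-level datasets client is used here.

```python
# Hedged sketch: append a large result set to an existing Domo dataset
# in row-chunks instead of one giant upload.
import pandas as pd

def split_frame(df, chunk_rows=100_000):
    """Yield successive row-slices of df, each at most chunk_rows long."""
    for start in range(0, len(df), chunk_rows):
        yield df.iloc[start:start + chunk_rows]

def append_in_chunks(domo, dataset_id, df, chunk_rows=100_000):
    """Append df to an existing dataset one chunk at a time.

    Uses the low-level datasets client so each call can APPEND;
    verify the update_method keyword against your pydomo version.
    """
    for chunk in split_frame(df, chunk_rows):
        csv_data = chunk.to_csv(index=False, header=False)
        # 'APPEND' adds rows; the default 'REPLACE' would overwrite.
        domo.datasets.data_import(dataset_id, csv_data,
                                  update_method='APPEND')

# Usage (placeholder credentials; run from your Databricks notebook):
# from pydomo import Domo
# domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')
# append_in_chunks(domo, DATASET_ID, monthly_df)
```

Keeping each chunk well under the full 2M rows avoids a single multi-part upload, which is where the "non-consecutive part ids" error comes from.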
Best Answer
-
Have you tried using the limit and offset parameters in your GET request to extract the data from Domo in pages?
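The limit/offset idea above amounts to a simple pagination loop. A minimal sketch, where `fetch_page` is a hypothetical stand-in for whatever call you use (e.g. a Domo query or GET request that accepts limit and offset parameters):

```python
# Hedged sketch of limit/offset pagination: request limit-sized pages
# at increasing offsets until a short page signals the end of the data.
def paginate(fetch_page, limit=50_000):
    """Collect all rows by paging through fetch_page(limit=, offset=)."""
    rows, offset = [], 0
    while True:
        page = fetch_page(limit=limit, offset=offset)
        rows.extend(page)
        if len(page) < limit:  # short (or empty) page: no more data
            return rows
        offset += limit
```

Pulling 2M rows in, say, 50K-row pages keeps each request small enough to avoid the 400 error on a single oversized transfer.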
Answers
-
-
Can you give me more details about it please?