Trying to append data from Databricks
Hello,
My requirement is to pull data from a Databricks table with a query (about 90 columns and 12 months of history, roughly 2M records) and append it to an existing dataset via the API.
Because of the large volume of data, I'm getting the error '"status":400,"statusReason":"Bad Request","message":"Too many non-consecutive part ids. Make consecutive or upload multiple data versions'.
I then tried running the query month by month and appending each slice to the dataset using:
domo.ds_create('datasetname')
domo.get('datasetid')
Now I get the error 'Python int too large to convert to C long'.
Can someone help me here, or provide the proper syntax to append data daily to the existing dataset using a Python script that I can run from Databricks?
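
For reference, the "Too many non-consecutive part ids" message appears to come from Domo's Stream upload endpoint, which expects the parts of a single execution to be numbered consecutively (1, 2, 3, …) before the execution is committed. Below is a rough sketch of a daily append using pydomo's Stream client from a Databricks notebook. The client id/secret, dataset id, query, and chunk size are placeholders, and it assumes the existing dataset is fed by a Stream whose update method is APPEND; treat it as a starting point rather than a confirmed recipe.

```python
from pydomo import Domo

# Placeholder credentials and dataset id (assumptions -- fill in your own)
CLIENT_ID = 'your-client-id'
CLIENT_SECRET = 'your-client-secret'
DATASET_ID = 'your-existing-dataset-id'
CHUNK_ROWS = 100_000  # rows per uploaded part; tune to keep each part reasonably small

domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')
streams = domo.streams

# Find the Stream that feeds the existing dataset
# (if this returns nothing, the dataset is not Stream-backed and would need
#  to be recreated via streams.create with an APPEND update method)
stream = streams.search('dataSource.id:' + DATASET_ID)[0]

# Pull one day of data from Databricks (`spark` is available in a Databricks notebook);
# the catalog/table and date filter below are hypothetical placeholders
pdf = spark.sql("""
    SELECT *
    FROM my_catalog.my_schema.my_table
    WHERE event_date = current_date() - 1
""").toPandas()

# If a column holds integers larger than the platform's C long, casting it to
# string before upload is one way to avoid the overflow error (assumption)
# pdf['big_id_col'] = pdf['big_id_col'].astype(str)

# One Stream execution per daily append, with consecutive part numbers starting at 1
execution = streams.create_execution(stream['id'])
part_num = 1
for start in range(0, len(pdf), CHUNK_ROWS):
    chunk = pdf.iloc[start:start + CHUNK_ROWS]
    csv_part = chunk.to_csv(index=False, header=False)  # no header row in data parts
    streams.upload_part(stream['id'], execution['id'], part_num, csv_part)
    part_num += 1  # part ids must be consecutive: 1, 2, 3, ...

streams.commit_execution(stream['id'], execution['id'])
```

Whether the commit appends or replaces is governed by the Stream's updateMethod, so it's worth confirming the Stream was created with APPEND before scheduling this to run daily.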
Best Answer
Have you tried using the limit and offset parameters in your GET request to extract the data from Domo in pages?
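
If it helps, this is the general shape of limit/offset paging with Python's requests library. The endpoint, parameter names, and token below are placeholders rather than a confirmed Domo API signature, so check the Domo developer docs for the endpoint you are calling.

```python
import requests

# Hypothetical endpoint and token -- placeholders, not a documented Domo URL
BASE_URL = 'https://api.example.com/v1/records'
HEADERS = {'Authorization': 'Bearer <access-token>'}
PAGE_SIZE = 50_000

offset = 0
rows = []
while True:
    resp = requests.get(
        BASE_URL,
        headers=HEADERS,
        params={'limit': PAGE_SIZE, 'offset': offset},
    )
    resp.raise_for_status()
    page = resp.json()
    if not page:
        break  # an empty page means all rows have been fetched
    rows.extend(page)
    offset += PAGE_SIZE
```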
Answers
Can you give me more details about it please?