Trying to append data from Databricks
Hello,
My requirement is to pull data from a Databricks table (a query returning 90 columns over 12 months, around 2M records) and append it to an existing Domo dataset via the API.
Since it's a large volume of data, I'm getting this error: `"status":400,"statusReason":"Bad Request","message":"Too many non-consecutive part ids. Make consecutive or upload multiple data versions"`.
I then tried to run the query month by month and append it to the dataset using:

```python
domo.ds_create('datasetname')
domo.get('datasetid')
```

Now I get the error 'Python int too large to convert to C long'.
Can someone help me here or provide the proper syntax to append data daily to the existing dataset using a Python script that I can run from Databricks?
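The "non-consecutive part ids" error comes from the Stream upload: part numbers must start at 1 and increase by 1 within an execution. A minimal sketch of one way to do that, assuming pydomo's Stream client (method names as in the pydomo examples; `stream_id` and the `domo` session are placeholders you'd supply). Serializing each chunk to a CSV string may also sidestep the 'Python int too large to convert to C long' error, which often stems from 64-bit integers being forced through a 32-bit C long:

```python
# Sketch: chunked upload with consecutive part ids via pydomo's Stream API.
# stream_id and the authenticated `domo` client are placeholders.
import math


def make_parts(num_rows, rows_per_part):
    """Yield (part_id, start_row, end_row) with part ids 1, 2, 3, ...

    The Stream API rejects uploads whose part ids are not consecutive,
    so the ids are generated here rather than derived from month numbers.
    """
    num_parts = math.ceil(num_rows / rows_per_part)
    for i in range(num_parts):
        start = i * rows_per_part
        end = min(start + rows_per_part, num_rows)
        yield i + 1, start, end  # part ids start at 1, no gaps


def upload_in_parts(domo, stream_id, df, rows_per_part=100_000):
    """Upload a DataFrame to a Domo stream in consecutive parts, then commit."""
    streams = domo.streams
    execution = streams.create_execution(stream_id)
    for part_id, start, end in make_parts(len(df), rows_per_part):
        # CSV text avoids passing raw int64 values through the upload layer.
        csv_chunk = df.iloc[start:end].to_csv(index=False, header=False)
        streams.upload_part(stream_id, execution['id'], part_id, csv_chunk)
    streams.commit_execution(stream_id, execution['id'])
```

With 2M records and 100k rows per part this produces parts 1 through 20 in one execution, which satisfies the consecutive-id requirement; this is a sketch, not tested against your instance.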
Best Answer
Have you tried using the limit and offset parameters in your GET request to extract the data from Domo with pagination?
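A generic sketch of the limit/offset loop the answer describes, assuming a `fetch_page(limit, offset)` callable that wraps your Domo GET request (for example, a query through pydomo with `LIMIT`/`OFFSET`); the function name is illustrative, not a Domo API:

```python
# Sketch: paginated extraction -- keep requesting pages until a short page
# signals the end of the dataset.
def fetch_all(fetch_page, limit=50_000):
    """Call fetch_page(limit, offset) repeatedly and collect all rows.

    Stops when a page comes back with fewer rows than `limit`.
    """
    rows, offset = [], 0
    while True:
        page = fetch_page(limit, offset)
        rows.extend(page)
        if len(page) < limit:
            return rows
        offset += limit
```

Keeping `limit` well below the full row count bounds memory per request, which matters at 2M rows with 90 columns.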
Answers
Can you give me more details about that, please?