Pydomo Stream API upload_part skipping files?

I am using pydomo to upload several gzipped files in parallel. (These gzipped CSV files are parts of one large SQL table, split approximately evenly.) However, once the execution is committed, the resulting dataset has fewer rows than the original table. The difference appears to equal the row count of one of the files that was meant to be uploaded.
Just like in the Stream API example, I create a stream, create an execution for that stream, and call streams.upload_part() with the created stream and execution. Is there a way to get the status of each streams.upload_part() call, so that I know which files have failed?
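In case it clarifies what I'm after, a trimmed-down sketch of the checking I have in mind is below. The credentials, STREAM_ID, and file_paths are placeholders, and I'm assuming pydomo's upload_part raises an exception on a failed upload rather than returning an error status; if the parts are submitted to worker threads, those exceptions seem to be lost unless each future's result() is checked:

```python
import gzip
from concurrent.futures import ThreadPoolExecutor, as_completed

from pydomo import Domo

# Placeholders -- substitute real credentials, stream id, and file list.
CLIENT_ID = 'your-client-id'
CLIENT_SECRET = 'your-client-secret'
STREAM_ID = 42
file_paths = ['table_part1.csv.gz', 'table_part2.csv.gz', 'table_part3.csv.gz']

domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')
streams = domo.streams

execution = streams.create_execution(STREAM_ID)
execution_id = execution['id']

def upload_one(part_num, path):
    # Decompress locally and hand the CSV text to upload_part.
    with gzip.open(path, 'rt') as f:
        streams.upload_part(STREAM_ID, execution_id, part_num, f.read())
    return part_num

failed = []
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(upload_one, n, p): (n, p)
               for n, p in enumerate(file_paths, start=1)}
    for fut in as_completed(futures):
        part_num, path = futures[fut]
        try:
            fut.result()  # re-raises any exception raised in the worker thread
        except Exception as exc:
            failed.append((part_num, path))
            print(f'part {part_num} ({path}) failed: {exc}')

if failed:
    print('not committing; retry the failed parts first:', failed)
else:
    streams.commit_execution(STREAM_ID, execution_id)
```

If that assumption is right, a part whose upload_part call raised inside a thread would simply never land in the execution, and committing anyway would produce exactly the missing-rows symptom I'm seeing.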
Thanks!
Comments
- Not sure about the issue specifically, but our company open-sourced a Python script we wrote to use the Stream API. Maybe there's some useful code in there, or feel free to use it as is.