Custom connector - append all rows with an ID greater than last appended
I am making a custom connector. The data vendor's API requires me to send the IDs of the rows I want to ingest, and it can then return up to 10-20k rows without timing out. So I need to break my requests into blocks of 10k rows and specify the ID range for each block. I can run this every hour, as each hour produces fewer than 10k rows, and append.
My question is: what's the best way to query the existing dataset, grab the highest ID present, add one, and then use that as the starting ID for the next pull?
Notes:
- the vendor has a version of the API that works with dates, but even a single day can cause time-outs.
- the feed is JSON
- I am using append, rather than replace
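The "grab the highest ID, add one, split into blocks" step can be sketched as plain logic. This is a minimal sketch, not Domo-specific code: `id_blocks` is a hypothetical helper name, and how you read the current MAX(id) back out of the existing dataset (e.g. via the Domo API or a DataFlow) is deliberately left as a comment because it depends on your setup.

```python
# Sketch of the block-splitting logic for the incremental pull.
# `last_max_id` would come from querying the existing dataset for its
# highest ID; `newest_id` from the vendor's API (both assumed available).

def id_blocks(last_max_id, newest_id, block_size=10_000):
    """Yield (start_id, end_id) ranges covering every ID after last_max_id,
    each no larger than block_size so the vendor API does not time out."""
    start = last_max_id + 1
    while start <= newest_id:
        end = min(start + block_size - 1, newest_id)
        yield (start, end)
        start = end + 1

# Example: the last append ended at ID 41000, the vendor now has IDs up to 63500.
blocks = list(id_blocks(41_000, 63_500))
# -> [(41001, 51000), (51001, 61000), (61001, 63500)]
```

Running this hourly, each block stays under the vendor's 10k-row limit and the next run picks up exactly where the last append stopped.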
Comments
Can anyone help with this request?
Thanks,
I've just noticed in the Redshift Advanced connector, there's the ability to pass parameters into your SQL, including:
Enter the query parameter value; it is the initial value for the query parameter. The last run date is optional; by default it is '02/01/1700' if not provided. For example: !{lastvalue:_id}!=1,!{lastrundate:start_date}!=02/01/1944
To be clear, this doesn't solve my issue, as my connector is JSON, not Redshift. Is there a way of achieving something similar with JSON?
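One way to get a `lastvalue`-style behaviour without the Redshift connector is to ask Domo itself for the highest ID already appended. A hedged sketch follows: the endpoint path and the `{"sql": ...}` body are based on Domo's public DataSet Query API, but verify both against the current Domo developer docs, and `build_max_id_query` plus the dataset ID are illustrative names only.

```python
import json

# Hypothetical sketch: builds (but does not send) a request asking Domo
# for the highest ID already in the dataset. The endpoint shape follows
# Domo's DataSet Query API ("POST /v1/datasets/query/execute/{dataset_id}"),
# where the dataset is always addressed as "table" in the SQL body.

def build_max_id_query(dataset_id):
    """Return (url, body) for querying the max ID already appended."""
    url = f"https://api.domo.com/v1/datasets/query/execute/{dataset_id}"
    body = json.dumps({"sql": "SELECT MAX(id) FROM table"})
    return url, body

url, body = build_max_id_query("abcd-1234")
# You would POST this with an OAuth bearer token, parse the single-cell
# result, add one, and use that as the start ID for the next JSON pull.
```

The JSON connector itself then only needs the computed start ID; the state lives in the dataset rather than in the connector.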