What is a standard workflow for using the Stream API?

I have an OLTP database from which I would like to stream/append hourly (new) data to my Domo DataSet.  Since I won't be replacing the DataSet in its entirety, using the Stream API to push data seems like the best approach.  There is no Cosmos DB connector in this case that would let Domo pull the data itself.


The API documentation and examples for the Stream API are, IMHO, a bit incomplete with regard to typical use cases.  They mostly show self-documented example calls that demonstrate the API works, versus workflows for actually using the API.  So I am reaching out to see how people typically use the Stream API.


This seems to be the typical push cycle:


1. Get or create a Stream.  If one already exists and you create another, you just get an ever-increasing number of DataSets in your Domo Data Center, even if the DataSet definition in your API call specifies APPEND.  It is unclear from the documentation whether you are supposed to create a new Stream every time you want to APPEND data to your DataSet.

2. Create a new Execution... although I seem to get this error when re-using a Stream: {"status":400,"statusReason":"Bad Request","message":"Infrastructure problem (429): ","toe":"HORAICUIK1-SF5NO-RP4IA"}

3. UploadPart... the new data to append, taken from my internal OLTP database.

4. Commit the Execution.

5. Wait some interval.

6. Repeat.
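For what it's worth, the cycle above (steps 2 through 4) can be sketched roughly as follows.  This is only a sketch under assumptions, not a confirmed workflow: it assumes an OAuth token has already been obtained, the Stream was created once up front (step 1), and the endpoint paths match the public Stream API layout (`/v1/streams/{id}/executions`, `.../part/{n}`, `.../commit`).  The retry-on-429 loop is my own guess at handling the "Infrastructure problem (429)" error mentioned in step 2, and `stream_id` / `csv_chunks` are placeholders.

```python
# Sketch of one hourly append cycle against an existing Stream.
# Assumptions: a requests.Session pre-configured with the OAuth bearer
# token, a Stream already created, and gzip-compressed CSV part uploads.
import gzip
import time
import requests

API = "https://api.domo.com/v1/streams"

def execution_url(stream_id):
    # Step 2: POST here to open a new Execution on the existing Stream.
    return f"{API}/{stream_id}/executions"

def part_url(stream_id, execution_id, part_num):
    # Step 3: PUT each gzip-compressed CSV chunk to its own part number.
    return f"{API}/{stream_id}/executions/{execution_id}/part/{part_num}"

def commit_url(stream_id, execution_id):
    # Step 4: PUT here to commit the Execution and append the data.
    return f"{API}/{stream_id}/executions/{execution_id}/commit"

def push_hourly(session, stream_id, csv_chunks, retries=3):
    """One append cycle: open an Execution, upload parts, commit.

    Retries Execution creation with exponential backoff, on the guess
    that the 400/"(429)" error above is a transient rate-limit response.
    """
    for attempt in range(retries):
        resp = session.post(execution_url(stream_id))
        if resp.status_code != 429:
            break
        time.sleep(2 ** attempt)  # back off before retrying
    resp.raise_for_status()
    execution_id = resp.json()["id"]

    # Upload each chunk of new OLTP rows as a numbered part.
    for part_num, chunk in enumerate(csv_chunks, start=1):
        session.put(
            part_url(stream_id, execution_id, part_num),
            data=gzip.compress(chunk.encode("utf-8")),
            headers={"Content-Type": "text/csv",
                     "Content-Encoding": "gzip"},
        ).raise_for_status()

    # Commit makes the appended rows visible in the DataSet.
    session.put(commit_url(stream_id, execution_id)).raise_for_status()
```

Called once an hour (cron, Azure Function, etc.) with the rows added since the last run, this would re-use one Stream and one DataSet rather than creating new ones each cycle.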


If I don't use Streams, it is unclear to me what the best approach to filling my DataSet would be.  I certainly don't want to replace the entire DataSet every hour with OLTP data that keeps growing.


Thanks in advance.



  • Jarvis (Contributor)

    In a case like this, the best option is to reach out to your Customer Success Manager.

    They will have access to resources that are skilled with the Streams API and will be able to advise the best course of action to take for your data. 

    I would also suggest submitting product feedback via the product feedback channel (from your launch menu) to let them know about the incomplete documentation. I've found that their team is very responsive to customer feedback and quick to update documentation that needs more information.

This discussion has been closed.