We need to export metadata for all datasets that use certain connectors. Specifically, we have created a large number of datasets using the PostgreSQL and BigQuery connectors. We need to export (1) the qu…
I need to copy a dataset from one instance to another. On the first run, I need to copy the entire dataset. Subsequently, only the new/delta data in the source needs to be appended to the dataset in …
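A minimal sketch of the full-copy-then-append logic described above, in Python. The `key` column name and row shape here are illustrative assumptions; any monotonically increasing column in the source (an ID or a load timestamp) would serve as the watermark.

```python
def delta_rows(source_rows, last_key, key="id"):
    """Select the rows to append on this run.

    On the first run last_key is None, so the full dataset is copied;
    on later runs only rows whose key exceeds the last copied value
    are appended to the target dataset.
    """
    if last_key is None:
        return list(source_rows)
    return [row for row in source_rows if row[key] > last_key]


def next_watermark(rows, key="id"):
    """Record the highest key seen, to drive the next delta run."""
    return max(row[key] for row in rows) if rows else None
```

Persisting the value returned by `next_watermark` after each run is what turns the second and later runs into append-only copies.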
We are getting the below error when trying to build a Redshift dataflow: we are unable to run previews, and we are unable to run the dataflow itself. It fails every time we try to run it. Any idea what i…
We have several scheduled Workbench jobs that are extracting data from SQL Server tables and loading to DOMO data sets. The jobs are failing to run at their scheduled times, but I am able to run them …
We are using a MySQL connector to load data from our Google Cloud SQL servers into DOMO. From what I can tell, it is only possible to connect to a database, and all the tables within that database bec…
I have a scheduled Workbench job to update a dataset that typically takes about 30 minutes to run. The job started yesterday at 1:37pm and has been stuck on 'Preparing..' in the Result field (on the History…
Question: Is it possible to execute command line interface tool operations within Python, specifically the 'list-dataflow' command? Elaboration: The 'list-dataflow' command in the command line interfa…
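One way to drive the CLI from Python is to launch it as a subprocess and feed it the command text. This is only a sketch: the jar name (`domoUtil.jar`) and the assumption that the interactive CLI reads commands from stdin are both unverified here and may need adjusting for your install.

```python
import subprocess

CLI_JAR = "domoUtil.jar"  # assumed path to the Domo CLI jar; adjust as needed


def build_cli_argv():
    """argv that launches the Java-based CLI; no extra flags are assumed."""
    return ["java", "-jar", CLI_JAR]


def run_cli_command(command_text):
    """Run one CLI command such as 'list-dataflow' and return its stdout.

    Assumes the interactive CLI accepts commands on stdin and exits on
    'quit'; if your CLI version supports a script-file option, that is
    a more robust alternative to piping stdin.
    """
    result = subprocess.run(
        build_cli_argv(),
        input=command_text + "\nquit\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout
```

Called as `run_cli_command("list-dataflow")`, this would capture whatever the CLI prints for that command as a Python string.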
My question: What factors determine the type of a dataflow transform in the API JSON data (e.g. GenerateTableAction, SQL, SqlAction)? Background: I am using DOMO's command line tool to get JSON structu…
My question: Does anyone know DOMO's timestamp convention and how these map to normal timing conventions? Background: I am using DOMO's command line tool to get JSON structured metadata about a DataFl…
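If the timestamps in the exported JSON are 13-digit integers, they are very likely Unix epoch milliseconds (milliseconds since 1970-01-01 UTC), a common convention in Java-backed services. A hedged conversion sketch under that assumption:

```python
from datetime import datetime, timezone


def epoch_ms_to_iso(ms):
    """Convert an epoch-milliseconds timestamp to an ISO-8601 UTC string.

    Assumes the value is milliseconds since the Unix epoch; a 10-digit
    value would instead be whole seconds (drop the / 1000).
    """
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).isoformat()
```

For example, `epoch_ms_to_iso(0)` returns `"1970-01-01T00:00:00+00:00"`; comparing a converted value against a DataFlow's known run time is a quick way to confirm the convention.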