-
Export metadata on connector datasets
We need to export metadata for all datasets that use certain connectors. Specifically, we have created a large number of datasets using the PostgreSQL and BigQuery connectors. We need to export (1) the query used for each dataset, (2) the dataset ID for each dataset, and (3) the name of each dataset. I know it is possible to…
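Once the dataset metadata has been pulled (e.g. from a dataset-listing API call), filtering it down to the three requested fields is straightforward. The sketch below assumes a decoded JSON list of dataset records; the field names (`dataSourceType`, `sqlQuery`, `id`, `name`) are illustrative assumptions, not a documented DOMO schema, and would need to be checked against an actual response.

```python
import json

def export_connector_metadata(datasets, connectors=("postgresql", "bigquery")):
    """Keep only datasets built on the given connectors, and reduce each
    record to the id, name, and query fields.

    `datasets` is assumed to be a decoded JSON list of dataset records;
    the field names used here are assumptions, not a documented schema.
    """
    return [
        {"id": d.get("id"), "name": d.get("name"), "query": d.get("sqlQuery")}
        for d in datasets
        if d.get("dataSourceType") in connectors
    ]

# Example with a made-up payload:
sample = [
    {"id": "abc-123", "name": "Sales", "dataSourceType": "postgresql",
     "sqlQuery": "SELECT * FROM sales"},
    {"id": "def-456", "name": "Uploads", "dataSourceType": "webform",
     "sqlQuery": None},
]
print(json.dumps(export_connector_metadata(sample), indent=2))
```

The same filter-and-project step works regardless of whether the records come from the public API, the CLI tool, or an export file, as long as the field names are adjusted to match.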
-
Configuration Options for Dataset Copy Dataset Connector
I need to copy a dataset from one instance to another. On the first run, I need to copy the entire dataset. Subsequently, only the new/delta data in the source needs to be appended to the dataset in the other instance. I would like to avoid loading a complete refresh to the destination dataset every time new data is added…
-
Redshift: Unable to create database
We are getting the below error when trying to build a Redshift dataflow: We are unable to run previews, and we are unable to run the dataflow itself. It fails every time we try to run. Any idea what is causing this issue?
-
Scheduled Workbench Jobs not Running: The communication object, System.ServiceModel.Channels.Service
We have several scheduled Workbench jobs that are extracting data from SQL Server tables and loading to DOMO data sets. The jobs are failing to run at their scheduled times, but I am able to run them manually without issue. They are showing this error: Strangely, only some of our jobs are showing this error. We have other…
-
MySQL Connector to Read Only Some Tables in a Database rather than All Tables in a Database
We are using a MySQL connector to load data from our Google Cloud SQL servers into DOMO. From what I can tell, it is only possible to connect to a Database, and all the tables within that database become accessible via the connector. Any of the tables in our database can be chosen in this window (see the highlighted…
-
Workbench Job Hanging, Dataset stuck on 'Preparing'
I have a scheduled Workbench job to update a dataset that typically takes 30ish mins to run. The job started yesterday at 1:37pm and has been stuck on 'Preparing..' in the Result field (on the History tab) since then. I just canceled the workbench job and started a new manual run. It started a new run, but the history tab…
-
How to implement commands from the Command Line Interface Tool in Python, specifically API requests
Question: Is it possible to execute command line interface tool operations within Python, specifically the 'list-dataflow' command? Elaboration: The 'list-dataflow' command in the command line interface tool returns JSON structured metadata for a dataflow. My guess is that this command is just making a GET request to a…
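Even without reverse-engineering the underlying GET request, the CLI tool can be driven from Python via `subprocess` and its JSON output parsed directly. This is a generic sketch of that pattern; the actual DOMO CLI invocation (the jar/executable name and the `list-dataflow` argument syntax) is not shown and would need to be substituted, so a stand-in command is used for illustration.

```python
import json
import subprocess
import sys

def run_cli_json(cmd):
    """Run an external CLI command and parse its stdout as JSON.

    `cmd` is a list of argument strings, e.g. the full invocation of
    the DOMO CLI tool with its list-dataflow arguments (not shown here).
    Raises CalledProcessError if the command exits non-zero.
    """
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return json.loads(result.stdout)

# Demonstrated with a stand-in command that just emits JSON:
data = run_cli_json(
    [sys.executable, "-c", "import json; print(json.dumps({'ok': True}))"]
)
print(data)
```

If the CLI writes its JSON to a file rather than stdout, the same wrapper would instead `json.load()` the output path after the command completes.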
-
DataFlow Transform (i.e. Action) Types in API JSON data
My question: What factors determine the type of a dataflow transform in the API JSON data (e.g. GenerateTableAction, SQL, SqlAction)? Background: I am using DOMO's command line tool to get JSON structured metadata about a DataFlow. Specifically, I am using the list-dataflow command: Putting in the DataFlow id and a…
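A useful first step when exploring this is to tally the `type` values actually present in the exported JSON. The fragment below mimics the rough shape of a dataflow export; the `"actions"` key and `"type"` field are assumptions inferred from the question, not a documented schema.

```python
import json
from collections import Counter

# A made-up fragment mimicking the shape of a list-dataflow export;
# the "actions" key and "type" field are assumptions, not a documented schema.
dataflow_json = """
{
  "name": "Example DataFlow",
  "actions": [
    {"type": "LoadFromVault", "name": "input_1"},
    {"type": "GenerateTableAction", "name": "transform_1"},
    {"type": "PublishToVault", "name": "output_1"}
  ]
}
"""

dataflow = json.loads(dataflow_json)

# Count how many actions of each type the dataflow contains.
type_counts = Counter(action["type"] for action in dataflow["actions"])
print(type_counts)
```

Running this against several real exports (MySQL dataflows, Magic ETL dataflows, etc.) and comparing the resulting type tallies is one way to infer empirically which dataflow features map to which action types.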
-
Timestamp Convention in JSON File from Command Line DataFlow Request
My question: Does anyone know DOMO's timestamp convention and how it maps to standard date/time formats? Background: I am using DOMO's command line tool to get JSON structured metadata about a DataFlow. Specifically, I am using the list-dataflow command: Putting in the DataFlow id and a filepath returns a json file to…
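If the timestamp values are large integers (13 digits for recent dates), a reasonable working assumption is Unix epoch time in milliseconds, which is a common convention in JSON APIs. The sketch below converts such a value to a UTC datetime; whether DOMO's exports actually use this convention should be verified by converting a timestamp from a run whose wall-clock time is known.

```python
from datetime import datetime, timezone

def from_epoch_millis(ms):
    """Convert a millisecond Unix epoch value to an aware UTC datetime.

    Assumption: the JSON timestamp fields are Unix epoch milliseconds;
    verify against a known run time before relying on this.
    """
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)

print(from_epoch_millis(1609459200000))  # → 2021-01-01 00:00:00+00:00
```

If the values are only 10 digits, they are likely epoch seconds instead, and the division by 1000 should be dropped.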