DataFlow Transform (i.e. Action) Types in API JSON data

My question: What factors determine the type label of a dataflow transform in the API JSON data (e.g. 'GenerateTableAction', 'SQL', 'SqlAction')?

Background: I am using Domo's command-line tool to get JSON-structured metadata about a DataFlow. Specifically, I am using the list-dataflow command.

Supplying the DataFlow id and a filepath writes a JSON file to that filepath. Within the JSON file, there is data about the "actions" used to build the dataflow (e.g. loading the datasets, transforming them, and producing outputs). For example, the action of loading a dataset is labeled 'LoadFromVault' in the JSON data, and the output action is labeled 'PublishToVault'.
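To show the kind of inspection I'm doing, here is a minimal sketch. The JSON structure (a top-level "actions" list whose items carry a "type" field) is assumed from the labels discussed below, not taken from Domo documentation, and the sample data is invented:

```python
import json
from collections import Counter

# Toy stand-in for the file that list-dataflow writes out. The structure
# (top-level "actions" list, each action with "name" and "type") is an
# assumption based on the labels seen in real output.
raw = """
{
  "actions": [
    {"name": "input",  "type": "LoadFromVault"},
    {"name": "step_1", "type": "GenerateTableAction"},
    {"name": "step_2", "type": "SqlAction"},
    {"name": "output", "type": "PublishToVault"}
  ]
}
"""

dataflow = json.loads(raw)

# Count how many actions of each type the dataflow contains.
type_counts = Counter(action["type"] for action in dataflow["actions"])
print(type_counts)
```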

My confusion concerns the labels assigned to the transform actions. My understanding is that there are two kinds of transforms available in the Domo GUI: Table and SQL. But within the JSON data, I have come across three different labels for transform actions: 'GenerateTableAction', 'SQL', and 'SqlAction'. Furthermore, there are cases where an action that is a Table transform in the GUI is labeled 'SQL' in the JSON data.

So, what determines how these labels are assigned, and how do they differ from one another?


  • Hi @walker_page

    I believe the actions are the different types of steps in your MySQL DataFlow. 'GenerateTableAction' would be the first steps, where it reads in your data for use further down in your dataflow. 'SQL' actions are Table-type transforms or output datasets. 'SqlAction' actions are SQL-type transforms.

  • Hi @GrantSmith,

    Thanks for your response. I don't think that is right though. I have a dataflow where Table type Transforms are labeled 'GenerateTableAction' (not 'SQL') and a dataflow where SQL type Transforms are labeled 'SQL' (not 'SqlAction').

  • @walker_page and @GrantSmith

    Both MySQL and Magic will have similar (but not necessarily the same) action definitions, but of course they run in different environments.

    @walker_page if you're trying to dynamically generate Magic ETL, then just chain all the scripting tiles and take note of their names. The harder part will be assigning a unique ID / GUID that the system will accept.

    I'm not saying it's impossible ... but I doubt you'll be successful.

    One step back. Can you design a pipeline that doesn't require automated ETL generation?

    Jae Wilson
    Check out my 🎥 Domo Training YouTube Channel 👨‍💻

    **Say "Thanks" by clicking the ❤️ in the post that helped you.
    **Please mark the post that solves your problem by clicking on "Accept as Solution"
  • @jaeW_at_Onyx, thanks for your response. My use case isn't to dynamically generate Magic ETL, but simply to generate a report (in a Google spreadsheet) that contains information about the dataflow. I am extracting data from the JSON object, and I want separate tabs in the output spreadsheet: one for Table transforms and one for SQL transforms. But given the label confusion in the JSON data, I don't know how to do this accurately. Does that make sense?
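    Concretely, this is the partitioning I want to do. The sketch below assumes the label mapping from this thread ('GenerateTableAction' for Table transforms, 'SQL'/'SqlAction' for SQL transforms), which is exactly the part I'm unsure about, and the sample actions are invented:

    ```python
    # Assumed label sets -- taken from this discussion, possibly not exhaustive.
    TABLE_TYPES = {"GenerateTableAction"}
    SQL_TYPES = {"SQL", "SqlAction"}

    def split_transforms(actions):
        """Return (table_transforms, sql_transforms), skipping input/output actions."""
        table_tab = [a for a in actions if a["type"] in TABLE_TYPES]
        sql_tab = [a for a in actions if a["type"] in SQL_TYPES]
        return table_tab, sql_tab

    actions = [
        {"name": "input",  "type": "LoadFromVault"},
        {"name": "step_1", "type": "GenerateTableAction"},
        {"name": "step_2", "type": "SqlAction"},
        {"name": "output", "type": "PublishToVault"},
    ]
    table_tab, sql_tab = split_transforms(actions)
    print([a["name"] for a in table_tab])
    print([a["name"] for a in sql_tab])
    ```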

    Ok... now it just sounds like you're doing an export of a MySQL dataflow.

    I think we're back to @GrantSmith 's answer.

    In MySQL you can do two types of operations. Either you generate a table (which functionally converts your SELECT statement into a CREATE TABLE ... AS statement), or you write a SQL transform that can run any other type of DML or DDL (UPDATE, ALTER, DELETE, etc.)
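    A sketch of that distinction, with illustrative SQL (not taken from any real dataflow):

    ```python
    # Illustrative only: how the two MySQL action kinds treat their SQL.

    # A Table transform takes a bare SELECT, and the engine functionally
    # wraps it as CREATE TABLE <step name> AS <select>.
    select_stmt = "SELECT id, SUM(amount) AS total FROM orders GROUP BY id"
    generated = f"CREATE TABLE step_1 AS {select_stmt}"

    # A SQL transform runs its statement verbatim, so it can be any
    # other DML/DDL:
    sql_transform_stmt = "UPDATE orders SET amount = 0 WHERE amount IS NULL"

    print(generated)
    ```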
