SQL View of DataSets
Is there a way to view the SQL query of an output dataset in Domo?
We are working on exporting our data to Cognos and it would be nice to see the query we are working with in Domo when recreating it in Cognos.
Thanks,
Erik
Answers
-
@EWold It depends on where your dataset comes from. If the dataset is an output of a dataflow, you should be able to see the query in the dataflow editor if it is from a MySQL or Redshift dataflow. Otherwise, I don't believe there is a way to produce the SQL query from Magic ETL.
-
Hi Michelle,
My dataset does come from the output of a DataFlow, but I'm unsure where to see the SQL behind it. Where in the DataFlow editor can I see it? Below is a basic example of what I'm working with.
Thanks,
Erik
-
@EWold, do you have anyone in your organization who knows SQL? The Magic ETL is just a visual representation of SQL, so if there's somebody who's knowledgeable with SQL, they should be able to use the ETL to write an equivalent query.
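To make that concrete, a simple Magic ETL flow (two Input DataSet tiles, a Join Data tile, a Rename Columns tile, and a Filter Rows tile) might translate into a MySQL query roughly like the sketch below. All table and column names here are hypothetical placeholders, not anything from your actual dataflow:

```sql
-- Rough, hand-written equivalent of a simple Magic ETL flow.
-- Table and column names are hypothetical placeholders.
SELECT
    o.order_id,
    o.order_date,
    o.amount        AS order_amount,    -- "Rename Columns" tile
    c.customer_name
FROM orders AS o                        -- first "Input DataSet" tile
LEFT JOIN customers AS c                -- "Join Data" tile (left join)
       ON o.customer_id = c.customer_id
WHERE o.order_date >= '2023-01-01';     -- "Filter Rows" tile
```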
-
Magic ETL isn't performing SQL under the hood, so there's no SQL to extract from the ETL. It's all done using Apache Spark. You'd need someone to look through your ETL logic and then translate it into SQL.
-
@RobSomers unfortunately, "The Magic ETL is just a visual representation of SQL" is not an accurate statement and is potentially misleading.
Although Magic ETL tiles borrow heavily from the transforms we are accustomed to performing in SQL, and the Formula tile even allows you to write functions borrowed from MySQL, it is not accurate to assume that the engine underneath Magic ETL is a SQL database.
I believe the functions under the covers are implemented in Java and they are performing operations in more of a streaming text pipeline.
This might be splitting hairs, but it's important to understand, because if you think of Magic as a database layer with tables where you're issuing UPDATE commands against a set of records, it may discourage you from thinking about ways you can help Magic as it subdivides your tasks across multiple nodes for parallel processing.
In fact, the biggest differences and improvements in Magic over the SQL pipelines (Adrenaline, Redshift, and MySQL) include:
1) Magic does not have to load all the data into a table before it can start processing it. Because the data moves in a text stream, as soon as the first row of data arrives it can start flowing through your ETL tiles.
2) Magic does not have to index (this is a double-edged sword). Because Magic is not a database engine, there is no concept of indexing, so JOIN, RANK, and SORT operations would be slower than the equivalent operations performed in Redshift or Adrenaline, assuming all other factors were equal (see the sketch below for the contrast).
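For contrast, in a MySQL dataflow you can add your own index to a join key in a transform step before joining, which is a lever Magic simply doesn't expose. A rough sketch, with hypothetical table and column names:

```sql
-- Transform 1: index the join key (MySQL dataflow only; Magic has no equivalent step).
ALTER TABLE orders ADD INDEX idx_orders_customer_id (customer_id);

-- Transform 2: the subsequent join can now take advantage of the index.
SELECT o.*, c.customer_name
FROM orders AS o
JOIN customers AS c
  ON o.customer_id = c.customer_id;
```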
Jae Wilson
-
Is there a way to get the Apache Spark output or some other text summary from a Magic ETL? It is very cumbersome to click into each individual icon to see what columns were selected/renamed, how things were joined, and all the different potential functions. We are trying to move some of our modeling into dbt and also build out a dictionary of metrics, and our ETLs generally include 5-13 inputs.
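For reference, the target format we're after is just plain one-SELECT-per-model SQL; a hand-translated dbt model covering a single Magic ETL "Group By" tile might look something like this (the source and column names are made-up placeholders):

```sql
-- models/daily_orders.sql  (hypothetical dbt model)
-- Hand-written equivalent of a single Magic ETL "Group By" tile.
SELECT
    order_date,
    COUNT(*)    AS order_count,
    SUM(amount) AS total_amount
FROM {{ source('domo_exports', 'orders') }}   -- hypothetical dbt source
GROUP BY order_date
```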
-
Similar use case for data lineage. We are working on establishing lineage to understand the business transformations that are embedded in Domo, most of which are done as part of Dataflows using Magic ETL. As mentioned by others, there seems to be no way to review and validate the transformations within each dataflow other than clicking on each individual icon. If not SQL output, is there any other format, like a JSON dump? Or are there APIs we can leverage to generate documentation/SQL or, in general, explain what transformations the data undergoes within Domo Dataflows?