Updating a column or deleting rows of a table in a SQL transform (saving to the source table)
Hello Everyone,
I started working in Domo just a couple of weeks ago and am currently working on ETL transformations.
My requirement is to update a column or delete rows of a table using a SQL transform and save the result to the same table. I have tried this both as individual transforms and in a stored procedure, to no avail. Please help me with this.
Analysis:
An output is mandatory for a SQL transform in Domo, and when I give the output the same name as the source, a new target table is created with that name (whereas I want to modify the source table itself).
Note: I am surprised that Domo created a duplicate table with the same name, as this can cause conflicts/misunderstandings later. I am curious to know the use case, as creating another object with the same name does not seem desirable.
Thank you for your time.
Regards,
Kaushik
Answers
-
Kaushik,
Do you have any screenshots you can upload as an image in the post or as an attachment so we get a clearer understanding of what you're trying to accomplish?
Thanks!
Dani
-
I have attached a screenshot of what I am trying to do.
I am trying to update one row for a student_id and delete one row for a student_id in the source table kvk_temo_studnet_target.
After I run that process, I get two datasets with the name kvk_temo_studnet_target, while I would like to have one kvk_temo_studnet_target with the update and delete reflected.
Note: I marked your previous post as the solution by mistake.
-
Hi Kaushik,
Could you also open each of the transforms and attach a screenshot of the code you are using? It will help me understand what is happening and provide some guidance.
Thank You!
-
Hello,
Apologies, I have attached the screenshots of the individual transforms as requested.
As they were simple queries I did not post them earlier, but I see your point.
To reiterate, I want to update/delete the source table within the same SQL dataflow (if there is a way).
Please let me know if you need anything. Thank you.
-
Thanks for sending that over.
I see what you are trying to do now. What you are doing is correct (UPDATE and DELETE). The issue you are running into is that the output dataset is not the same as the input dataset, even if you name the output dataset the same as the input dataset (behind the scenes they are different tables).
This is intentional on Domo's part for a few reasons:
1) We want to preserve and protect the source datasets.
2) If the output dataset were the same as the input dataset, it would create a recursive dataflow that never stopped running, because the dataflow is kicked off whenever the input dataset is updated.
Could you help me understand the use case better? If I know what you are trying to accomplish, perhaps we can recommend another approach.
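In the meantime, one common workaround is to express the change as a SELECT and write the corrected rows to a new output dataset. A minimal sketch, assuming a hypothetical score column (the id values are also made up for illustration; only student_id and the table name come from your earlier post):

SELECT student_id,
       CASE WHEN student_id = 101 THEN 95 ELSE score END AS score   -- the "update"
FROM   kvk_temo_studnet_target
WHERE  student_id <> 102                                            -- the "delete"

Downstream cards can then point at that new output dataset instead of the source.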
-
Thank you for the response. It helped.
My scenario was to keep a history of data with a week of delta data appended (after removing the overlap from the history data). I was able to achieve this with a recursive dataflow.
However, deleting and updating a table within the dataflow was something I was not able to crack, hence the question. Thanks for the help!
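In case it helps anyone later, the overlap-removal step of such a recursive dataflow can be a single transform along these lines. A sketch only, assuming the history and delta datasets share an identical schema keyed on student_id (the table names are illustrative):

SELECT h.*
FROM   historical_data h
LEFT JOIN delta_data d ON d.student_id = h.student_id
WHERE  d.student_id IS NULL        -- keep only history rows the delta does not supersede
UNION ALL
SELECT * FROM delta_data           -- then append the new week of data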
-
Hi,
I think you are trying to accomplish an upsert: update historical data with delta data and add new rows where they exist, assuming that the historical table and delta table schemas are identical. This will give you a new output dataset containing the historical data upserted with the delta data.
You can use the following two transforms in a Redshift dataflow:
Transform 1 (*uncheck "Generate Output Table"):
DELETE FROM Historical_Data
USING Delta_Data
WHERE Historical_Data.id = Delta_Data.id
Transform 2 (*uncheck "Generate Output Table"):
INSERT INTO Historical_Data
SELECT * FROM Delta_Data
Output dataset (*dataset name: UpSerted_Data):
SELECT * FROM Historical_Data
There are some changes that can be made to accommodate various scenarios. Hope this helps.
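One caveat: if you are on a MySQL dataflow rather than Redshift, MySQL does not accept DELETE ... USING, so Transform 1 would need MySQL's multi-table DELETE syntax instead. A sketch with the same assumed id key:

DELETE Historical_Data
FROM   Historical_Data
INNER JOIN Delta_Data ON Historical_Data.id = Delta_Data.id   -- drop rows the delta will replace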
Ricardo Granada
MajorDomo@Lusiaves
**If the post solves your problem, mark it by clicking on "Accept as Solution"
**You can say "Thank you" by clicking the thumbs up in the post that helped you.