Point New Dataflow to Existing Dataset
Comments
-
Hi @user19085 ,
How did you create the new dataflow? You need a dataset to create a dataflow; without a dataset you can't create one.
However, if you are using SQL to create the dataflow, you can copy and reuse the same SQL with a different dataset, just change the dataset name and column names if required.
**Say "Thanks" by clicking the "heart" in the post that helped you.**
**Please mark the post that solves your problem by clicking on "Accept as Solution"**
Thanks,
Neeti
@user19085 - Are you wanting to have the new dataflow point to the existing dataset as the output and overwrite the existing dataset or are you wanting to use the existing dataset as an input to your new dataflow?
If you're looking to overwrite the existing dataset, I don't think that's an option with Magic ETL or MySQL dataflows, since they create a new output dataset when the dataflow first runs.
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
Thanks Grant. Yes, I have a new dataflow and want to point to an existing dataset and overwrite the data. What I'm finding is that it seems that the dataset created by a dataflow is unique to that dataflow. In my mind the target dataset/table should not care where the data is coming from.
Thanks Neeti. I have an existing dataflow with an existing target dataset. I created a new dataflow to replace the existing one, but what we are finding is that when the existing cards are repointed to the new dataset/table, most of the Beast Modes don't migrate, requiring an extensive rebuild of the cards.
Can you combine the new dataflow into the old dataflow and utilize the old dataflow's output dataset?
If you need to keep the existing dataflow's data, you could create a new output dataset and rename it to match the old one (even though Domo would still treat it as a different dataset).
You can alter dataflows if you know how to monitor network traffic and send curl requests. Or you can update the JSON definition of a dataflow using the Java CLI (my recommendation over curl or Postman) and the list-dataflow command.
This is definitely one of those ... use at your own risk kind of things.
Creating a workflow where you:
1) create a copy of the ETL, let's call it DEV_ETL, iterate, and QA, then
2) copy the contents of DEV_ETL into PROD_ETL
will probably give you the results you're looking for while ensuring that you don't accidentally blow up cards while you're testing / iterating.
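To illustrate the "edit the JSON definition" approach, here's a minimal sketch of the repointing step. The JSON structure shown (an `outputs` list with `dataSourceId` fields) is a hypothetical simplification; the actual definition you export with the Java CLI will look different per instance and dataflow type, so inspect your own export before editing it.

```python
import json

# Hypothetical, simplified shape of an exported dataflow definition.
# The real JSON exported via the Java CLI will have more fields and
# may nest outputs differently -- inspect your own export first.
dataflow = {
    "name": "PROD_ETL",
    "outputs": [
        {"dataSourceId": "old-dataset-guid", "dataSourceName": "My Output"},
    ],
}

def repoint_output(definition: dict, old_id: str, new_id: str) -> dict:
    """Swap an output dataset ID so the dataflow writes to an existing dataset."""
    for out in definition.get("outputs", []):
        if out.get("dataSourceId") == old_id:
            out["dataSourceId"] = new_id
    return definition

repoint_output(dataflow, "old-dataset-guid", "existing-dataset-guid")
print(json.dumps(dataflow, indent=2))
```

After editing, you would push the modified definition back with the CLI, which is the "use at your own risk" part: a malformed definition can break the dataflow, so test against a copy first.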
Jae Wilson
Check out my 🎥 Domo Training YouTube Channel 👨💻
**Say "Thanks" by clicking the ❤️ in the post that helped you.**
**Please mark the post that solves your problem by clicking on "Accept as Solution"**
@GrantSmith and @Neeti, you could use the Java CLI + techniques shared in this video to point the output dataset of an ETL at a different dataset. ( cc @DataMaven )
Jae Wilson
Check out my 🎥 Domo Training YouTube Channel 👨💻