Reverting to an old dataset version
Best Answers
-
Here are the click steps for a dataflow dataset:
Locate the dataset you want and click it.
And here are the click steps for a non-dataflow dataset:
Locate the dataset you want and click it.
If I solved your problem, please select "yes" above
-
This only works if the dataset isn't the output of a dataflow, irrespective of recursion. So you can retrieve lost data via the history tab on any raw connector from the Appstore, Jupyter connectors, Workbench connectors, etc.
Also, restoring a dataflow to a previous version and re-running it only restores lost data if the loss was caused by the code change and the datasets powering the dataflow have not run/updated since the data loss.
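To judge whether that condition holds before re-running a restored dataflow version, you can check when the dataflow's input datasets last updated. Below is a minimal sketch using the pydomo SDK, assuming you have API client credentials from the Domo Developer portal; the dataset ID is a placeholder, and the timestamp field names are assumptions about the DataSet API response rather than a guaranteed schema.

```python
from pydomo import Domo

CLIENT_ID = 'your-client-id'                # placeholder: API client ID
CLIENT_SECRET = 'your-client-secret'        # placeholder: API client secret
INPUT_DATASET_ID = 'your-input-dataset-id'  # placeholder: a dataset powering the dataflow

domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')

# Fetch the dataset's metadata and print the row count plus any timestamp
# fields present, so you can compare them against when the data loss happened.
meta = domo.datasets.get(INPUT_DATASET_ID)
print(meta.get('name'), '-', meta.get('rows'), 'rows')
for key in ('createdAt', 'updatedAt', 'dataCurrentAt'):  # assumed field names
    if key in meta:
        print(f"{key}: {meta[key]}")
```

If the input datasets show an update after the loss occurred, restoring the dataflow version alone will not bring the missing rows back.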
Answers
-
Yes, if you click the three dots next to the version you want to revert to in the dataset history, you can find the option.
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
-
I don't see the dataset history option.
-
Where do I find the dataset history? I am not able to find it. Is it an access issue? I have admin access.
-
Just a side note: the version history option only works if the dataset is not recursive. If you have a recursive dataset, you can't bring back those rows with the three dots; you would have to use another data recovery method to restore them.
**If this answer solved your problem, be sure to like it and accept it as a solution!**
-
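Building on the side note above, a recursive dataset's lost rows cannot be brought back from version history, so you need your own recovery method. One option is to keep a scheduled CSV backup of the dataset through the Domo API. The sketch below assumes the pydomo SDK and API client credentials; the dataset ID, credentials, and file name are placeholders. Also note that re-importing via the API generally applies to API-created datasets, so for connector-backed datasets treat the exported CSV as an offline copy rather than a restore path.

```python
from pydomo import Domo

CLIENT_ID = 'your-client-id'          # placeholder: API client ID
CLIENT_SECRET = 'your-client-secret'  # placeholder: API client secret
DATASET_ID = 'your-dataset-id'        # placeholder: the dataset to back up

domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')

# Export the dataset's current rows to a local CSV (run this on a schedule
# so you always have a recent copy to fall back on).
csv_backup = domo.datasets.data_export(DATASET_ID, include_csv_header=True)
with open('dataset_backup.csv', 'w') as f:
    f.write(csv_backup)

# Later, if rows are lost, re-import the backup. data_import replaces the
# dataset's contents with the CSV, so only do this with a full backup.
with open('dataset_backup.csv', 'r') as f:
    domo.datasets.data_import(DATASET_ID, f.read())
```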