Reverting back to an old dataset version
Answers
Yes. If you click the three dots in the dataset history next to the version you want to revert to, you will find the option there.
**Was this post helpful? Click Agree or Like below**
**Did this solve your problem? Accept it as a solution!**
I don't see the dataset history option.
Where do I find the dataset history? I am not able to find that. Is it an access issue? I have admin access.
Here are the click steps for a dataflow dataset:
Locate the dataset you want and click it.
And here are the click steps for a non-dataflow dataset:
Locate the dataset you want and click it.
If I solved your problem, please select "yes" above
Just a side note: this only works if the dataset is not recursive. If you have a recursive dataset, you can't bring back those rows with the three dots; you would have to use another data recovery method to restore them (see the backup sketch below).
**If this answer solved your problem, be sure to like it and accept it as a solution!**
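Since version history can't restore rows lost in a recursive dataset, one hedge is to keep your own periodic snapshots of the dataset outside of Domo. Below is a minimal sketch, not an official recovery procedure, that uses the pydomo SDK's `ds_get` convenience method to export a dataset to a timestamped CSV. The client ID, client secret, dataset ID, and snapshot location are placeholders you would supply, and the cadence (how often you run it) is up to you.

```python
# Minimal backup sketch using the pydomo SDK (pip install pydomo).
# CLIENT_ID, CLIENT_SECRET, and DATASET_ID are placeholders - replace them
# with credentials from developer.domo.com and the ID of the dataset to back up.
from datetime import datetime, timezone

from pydomo import Domo

CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
DATASET_ID = "your-dataset-id"

# Authenticate against the Domo public API.
domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host="api.domo.com")

# Export the current contents of the dataset into a pandas DataFrame.
df = domo.ds_get(DATASET_ID)

# Write a timestamped CSV so each run keeps an independent snapshot.
stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
outfile = f"backup_{DATASET_ID}_{stamp}.csv"
df.to_csv(outfile, index=False)
print(f"Saved {len(df)} rows to {outfile}")
```

Scheduling this (for example, before any planned dataflow or connector changes) gives you a copy to re-upload if the recursive dataset ever loses rows.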
This only works if the dataset isn't the output of a dataflow, irrespective of recursion. In other words, you can retrieve lost data via the History tab on any raw connector dataset: Appstore connectors, Jupyter connectors, Workbench connectors, etc.
Also, restoring a dataflow to a previous version and re-running it only recovers lost data if the loss was caused by the code change and the datasets powering the dataflow have not run/updated since the loss occurred.
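Before reverting and re-running a dataflow, you could check that condition by comparing each input dataset's last-update timestamp against the time of the data loss. The sketch below uses pydomo's dataset metadata call; the credentials, input dataset IDs, and loss timestamp are placeholder assumptions, and it assumes the public API's `updatedAt` field reflects the most recent data update.

```python
# Hedged sketch: flag any of a dataflow's input datasets that have updated
# since a known data-loss time, using the pydomo SDK.
from datetime import datetime, timezone

from pydomo import Domo

CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
INPUT_DATASET_IDS = ["input-dataset-id-1", "input-dataset-id-2"]  # placeholders
DATA_LOSS_AT = datetime(2024, 1, 15, 8, 0, tzinfo=timezone.utc)   # example time

domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host="api.domo.com")

for ds_id in INPUT_DATASET_IDS:
    meta = domo.datasets.get(ds_id)  # DataSet API metadata for this dataset
    # Timestamps come back as ISO 8601 with a trailing "Z".
    updated_at = datetime.fromisoformat(meta["updatedAt"].replace("Z", "+00:00"))
    status = ("updated AFTER the loss - reverting may not recover rows"
              if updated_at > DATA_LOSS_AT else "unchanged since the loss")
    print(f"{meta['name']} ({ds_id}): last updated {updated_at} -> {status}")
```

If every input reports "unchanged since the loss", reverting the dataflow version and re-running it stands a good chance of reproducing the lost output; otherwise fall back to another recovery method such as the snapshot approach above.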