Reverting to an old dataset version
Answers
Yes, if you click the three dots in the dataset history next to the version you want to revert to, you'll find the option.
I don't see the dataset history option.
Where do I find the dataset history? I am not able to find that. Is it an access issue? I have admin access.
Here are the click steps for a dataflow dataset:
Locate the dataflow that outputs the dataset and click it, open its version history, revert to the version you want, and then run the dataflow again.
And here are the click steps for a non-dataflow dataset:
Locate the dataset you want and click it, open the History tab, click the three dots on the version you want, and choose to revert.
Just a side note: this only works if the dataset is not recursive. If you have a recursive dataset, you can't bring those rows back with the three dots; you would have to use another data recovery method to restore them.
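For example, one recovery method is to keep your own periodic snapshots of the dataset outside Domo. Here's a minimal sketch using the pydomo SDK; the client credentials, dataset ID, and snapshot location below are placeholders and assumptions, not anything from this thread:

```python
# Minimal sketch (assumption-heavy): snapshot a Domo dataset to CSV with the
# pydomo SDK so you have a restore point outside the in-product version history.
from datetime import datetime

from pydomo import Domo

CLIENT_ID = "your-client-id"          # placeholder: API client from the Domo Developer portal
CLIENT_SECRET = "your-client-secret"  # placeholder
DATASET_ID = "your-dataset-id"        # placeholder: the dataset you want to protect

# Authenticate against the Domo API.
domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host="api.domo.com")

# ds_get returns the dataset's current contents as a pandas DataFrame.
df = domo.ds_get(DATASET_ID)

# Write a timestamped snapshot; store these somewhere durable (S3, a file share, etc.).
snapshot_path = f"snapshot_{DATASET_ID}_{datetime.now():%Y%m%d_%H%M%S}.csv"
df.to_csv(snapshot_path, index=False)
print(f"Saved {len(df)} rows to {snapshot_path}")
```

A snapshot like this can be re-uploaded through your normal connector or ETL path if rows ever disappear from a recursive dataset.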
This only works if the dataset isn't the output of a dataflow, regardless of whether it is recursive. In other words, you can retrieve lost data via the History tab on any raw connector dataset (Appstore connectors, Jupyter connectors, Workbench connectors, and so on).
Also, restoring a dataflow to a previous version and re-running it only restores lost data if the loss was caused by the code change and the input datasets powering the dataflow have not run or updated since the data loss.
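To make that concrete, here is a toy (non-Domo) illustration of why reverting the dataflow version alone can't bring rows back once the inputs have updated; the DataFrames and filter are made up for the example:

```python
# Toy illustration (not Domo code): a recursive dataflow appends new input rows
# to its own previous output, so rows missing from both can't be recovered
# by reverting the dataflow version and re-running it.
import pandas as pd

def run_dataflow(previous_output: pd.DataFrame, new_input: pd.DataFrame, keep_row) -> pd.DataFrame:
    """Append the new input to the previous output, then apply the dataflow's filter."""
    combined = pd.concat([previous_output, new_input], ignore_index=True)
    return combined[combined.apply(keep_row, axis=1)]

history = pd.DataFrame({"day": ["Mon", "Tue"], "sales": [100, 120]})

# A bad code change (an over-aggressive filter) drops Monday's row from the output.
bad_output = run_dataflow(
    history,
    pd.DataFrame({"day": ["Wed"], "sales": [90]}),
    keep_row=lambda row: row["day"] != "Mon",
)
print(bad_output)  # Tue, Wed: Monday is gone

# The input dataset has since updated and now only holds Thursday's rows.
latest_input = pd.DataFrame({"day": ["Thu"], "sales": [110]})

# Reverting to the correct dataflow version and re-running it cannot recreate Monday,
# because neither the current output nor the current input contains that row anymore.
reverted_output = run_dataflow(bad_output, latest_input, keep_row=lambda row: True)
print(reverted_output)  # Tue, Wed, Thu: Monday stays lost
```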