Our organization has only been using Domo for about six months, but I'm already seeing some issues that could arise from how we handle datasets and dataflows.
Here's our situation: we have a number of datasets coming in from scripts run against a SQL database. That data arrives in our Domo instance, but it's not ready for production. I need to run it through one (or more) ETL passes before it finally becomes a dataset ready for prime time.
Can the Dojo speak to some of the measures your organizations take to make sure users are consuming the "correct" data and not something that might need additional work?
And what about the other datasets users create that might be considered offspring of these "gospel" sets? We've wanted to give our users some degree of freedom and transparency to work with data as they see fit, but as the MajorDomo, now having to deal with what can only be described as mutant datasets, I need to come up with some sort of plan.
Many thanks!
Brian