Alerts - When Dataset Fails X times in a row.

DataMaven Coach
edited February 2023 in Other Ideas

We are all overwhelmed with email alerts on dataset failures, right?

How often do you open the dataset only to find it was a single random failure and it has since run successfully?

I think it would help to have a setting where the dataset/dataflow failure alerts only trigger when there have been multiple consecutive failures.



  • jstan
    jstan Contributor

    What I've done in the interim is put an alert on datasets that haven't updated in XX minutes, with the minutes controlled in a Google Sheet keyed by dataset ID. I usually target my production datasets and can then dig into root causes. It updates every 15 minutes, so I get a single alert covering all the datasets I want to track.
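jstan's interim approach boils down to a staleness check: compare each dataset's last update time against a per-dataset threshold pulled from a config sheet. A minimal Python sketch of that logic (the dataset IDs, thresholds, and timestamps below are illustrative, not real Domo data):

```python
from datetime import datetime, timedelta

# Hypothetical per-dataset thresholds, as if read from the Google Sheet:
# dataset ID -> max minutes allowed since the last successful update.
THRESHOLDS = {
    "abc-123": 60,    # hourly production dataset
    "def-456": 1440,  # daily dataset
}

def stale_datasets(last_updated, thresholds, now=None):
    """Return IDs of datasets whose last update is older than their threshold.

    Datasets with no recorded update at all are also flagged as stale.
    """
    now = now or datetime.utcnow()
    stale = []
    for dataset_id, minutes in thresholds.items():
        updated = last_updated.get(dataset_id)
        if updated is None or now - updated > timedelta(minutes=minutes):
            stale.append(dataset_id)
    return stale

# Example: one dataset updated 2 hours ago, another 10 minutes ago.
now = datetime(2022, 1, 1, 12, 0)
last_updated = {
    "abc-123": now - timedelta(hours=2),     # exceeds its 60-minute threshold
    "def-456": now - timedelta(minutes=10),  # still fresh
}
print(stale_datasets(last_updated, THRESHOLDS, now))  # ['abc-123']
```

Running a check like this on a 15-minute schedule, as described above, turns many per-failure emails into one consolidated "these datasets look stuck" alert.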

  • TheLookout
    TheLookout Contributor
    edited January 2022

    If you're using the DomoStats datasets, you could build a small dataflow that shows you dataflows that have failed a specified number of times since their last successful run.

    I built this quickly, so I'd be willing to bet there are optimizations/changes you could make. The two inputs are Dataflows and Dataflows History from the DomoStats connector. For most instances this connector can only run its reports once a day, so be aware the results could be up to a day behind your live Domo data.

    Oh, and just in case you've never done it before: you can copy and paste that text document straight into a new ETL.

    I uploaded a different version of the text file with a typo corrected.
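The core of TheLookout's dataflow is "count consecutive failures since the last successful run" per dataflow, then flag anything at or above a threshold. A rough Python sketch of that logic (dataflow names, statuses, and the `threshold` parameter are illustrative; the real version would read the DomoStats history rows):

```python
def failures_since_last_success(history):
    """Count trailing consecutive failures in a run history.

    `history` is a list of run statuses ordered oldest -> newest,
    standing in for the DomoStats history rows of one dataflow.
    """
    count = 0
    for status in reversed(history):
        if status == "SUCCESS":
            break
        count += 1
    return count

def flag_failing_dataflows(histories, threshold):
    """Return dataflows whose trailing failure streak meets the threshold."""
    return [
        name for name, runs in histories.items()
        if failures_since_last_success(runs) >= threshold
    ]

histories = {
    "sales_etl": ["SUCCESS", "FAILED", "FAILED", "FAILED"],
    "hr_etl":    ["FAILED", "SUCCESS"],   # recovered after a one-off failure
    "ops_etl":   ["SUCCESS", "SUCCESS"],
}
print(flag_failing_dataflows(histories, threshold=3))  # ['sales_etl']
```

This is exactly the behavior the original idea asks for: `hr_etl`'s single random failure is ignored, while `sales_etl`'s streak of three crosses the threshold and would trigger an alert.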