Can I schedule a data flow to run at a specified time?
I created a data flow (thanks to answers from the Community!) which seems to take quite a long time when run manually. I've started it manually twice and cancelled it after 30 minutes each time. It hasn't run successfully yet, so I don't even know if it works. If it fails, at least I'll get to find my errors.
I'd like to schedule it to run sometime in the early morning before I get in. Is there a way to do that?
Best Answer
Hey @Jim_Medina,
I like to use a simple trick to accomplish this using the NOAA Weather connector. As you know, DataFlows can be set to run when any of their datasources update. An easy way to "schedule" a DataFlow to run at a specific time is to set up a "dummy" datasource that updates at the time you'd like the DataFlow to run.
The NOAA Weather connector works well for this since it only requires a zip code to set up, and we can leverage the connector's scheduling abilities to essentially "schedule" our DataFlow.
To set up the NOAA Weather connector, head to your Data Center and click the "New DataSet" button in the top right. Scroll down and select the NOAA Weather connector (not the weather-alerts version). Enter any zip code you'd like, click "Next", and you'll then be able to schedule how often this connector updates (daily, weekly, every 15 minutes, etc.). Note that the time-of-day setting is based on UTC, which is 5-7 hours ahead of US timezones.
Then simply include this new "dummy" weather dataset as an input to your DataFlow and scroll down to check the "Run DataFlow when [X] updates" box for the weather dataset. You don't even have to do anything with the weather dataset in the actual DataFlow; just ignore it. We're only using it for the scheduled update.
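If you'd rather not depend on a weather feed, the same trigger can be driven from outside Domo: a small cron job that pushes a one-row CSV into a dummy dataset will update that dataset at exactly the time you choose, and the "Run DataFlow when [X] updates" setting does the rest. Below is a hedged sketch, assuming Domo's public Dataset API (an OAuth client-credentials token from `api.domo.com/oauth/token`, then `PUT /v1/datasets/{id}/data` with a `text/csv` body); `MY_CLIENT_ID`, `MY_CLIENT_SECRET`, and `MY_DATASET_ID` are placeholders you would create in the Domo developer portal, not real values.

```python
import base64
import datetime
import json
import urllib.request

API = "https://api.domo.com"


def trigger_row(now=None):
    """Build a one-row CSV payload; the timestamp just makes each push distinct."""
    now = now or datetime.datetime.utcnow()
    return "triggered_at\n" + now.isoformat() + "\n"


def push_trigger(client_id, client_secret, dataset_id):
    """Replace the dummy dataset's contents; Domo treats this as a dataset update."""
    # OAuth client-credentials token, using Basic auth with the developer-portal keys.
    basic = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    token_req = urllib.request.Request(
        f"{API}/oauth/token?grant_type=client_credentials&scope=data",
        headers={"Authorization": "Basic " + basic},
    )
    token = json.load(urllib.request.urlopen(token_req))["access_token"]

    # PUT the CSV into the dummy dataset, which fires the DataFlow trigger.
    put_req = urllib.request.Request(
        f"{API}/v1/datasets/{dataset_id}/data",
        data=trigger_row().encode(),
        method="PUT",
        headers={"Authorization": "Bearer " + token, "Content-Type": "text/csv"},
    )
    urllib.request.urlopen(put_req)


# Run this from cron at the desired time (cron times are server-local), e.g.:
#   0 5 * * *  python3 trigger_dataflow.py
# where trigger_dataflow.py calls:
#   push_trigger("MY_CLIENT_ID", "MY_CLIENT_SECRET", "MY_DATASET_ID")
```

The dummy dataset here only needs the single `triggered_at` column; as with the weather trick, the DataFlow can simply ignore its contents.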
Hope that helps!
Answers
-
@Jim_Medina, did quinnj's reply help answer your question?
-
This is an extremely kludgy way to implement what is relatively basic functionality and should be a simple "schedule time" setting. Apparently this functionality just doesn't exist in Domo.
I'll "accept" this so I won't be annoyed with reminders to "accept": although it kind of answers my question, the solution isn't really acceptable.
-
I'm going to revive this since I haven't been able to find a more current thread, but Jim is correct, this isn't a solution.
Has there been any progress with forcing a dataflow to run on a schedule, even if the underlying dataset hasn't changed?
-
@Josh-REO A year and a half later, there's still no improvement in place. DataFlows are still only directly driven by dataset updates.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"