Datasets stuck refreshing
1. I have a PostgreSQL database that I upload to Domo via Workbench as 6 different ODBC datasets (let's call them 1, 2, 3, 4, 5, 6). Each dataset is updated every 30 minutes from Workbench, and these run like clockwork without a problem.
2. Each dataset is then modified in ETL with some calculations, becoming in turn 1a, 2a, 3a, 4a, 5a, 6a. These dataflows are automatically triggered every 30 minutes by the ODBC refreshes. Normally an update takes 8-12 seconds, but these datasets regularly get stuck on "running", stay that way for an hour or so, then cancel and restart. I would like to troubleshoot and/or understand what causes them to get stuck like that, because it causes the following dataflows to get stuck as well...
3. 1a, 2a, 3a, 4a, 5a, 6a are then combined into two separate datasets with the "join" command and other calculations (let's call them X and Y; see the sketch below). X and Y can be triggered by an update in any of the 1a-6a datasets, but the same thing happens: they get stuck for an hour without refreshing. Most likely this is caused by 1a-6a getting stuck, but I am asking here just to confirm...
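To make step 3 concrete, this is roughly what the combination that builds X would look like if written as SQL rather than configured in Magic ETL. The dataset and column names are placeholders I've invented for illustration; they are not the real schemas:

```sql
-- Hypothetical SQL equivalent of the Magic ETL join that builds X.
-- Dataset and column names are placeholders, not the real schemas.
SELECT
    a.record_id,
    a.reading_ts,
    a.value_1,
    b.value_2,
    c.value_3
FROM dataset_1a AS a
JOIN dataset_2a AS b ON a.record_id = b.record_id
JOIN dataset_3a AS c ON a.record_id = c.record_id;
```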
Anyone with any suggestions will be very welcome!
Thank you!
Comments
It sounds like an action in your ETL is just getting hung up, after which the process resets.
Have you viewed the dataflow History tab that shows what happened at each step? That screen is helpful for seeing where breakdowns happen and why. In the last column, Status, the green or red Success or Failed icon is a link. Clicking it takes you to the status of each step in the dataflow. Hover over each step's status to find the tooltip error message. My guess is it's on a join step.
Let us know how it goes.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
The problem is I don't get fails. What happens is that the ETL runs for an hour or so (50 min to 1 hr 20 min), then it cancels and auto-restarts, and in most cases it then goes right through automatically, but sometimes it can hang for an hour again. If I click on Details, nothing shows, because it did not "fail", it auto-cancelled...
That's frustrating. And running dataflows don't show that detail.
One of my dataflows just now had the same problem and I can't tell either. Usually it runs in a few minutes, but this one auto-cancelled at 45 minutes.
This almost sounds like a processing engine problem. Like a system hiccup, especially if those dataflows are all the same version. I don't think users have the details available to diagnose system problems like that.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
Well, my problem is that I am using a free account for now, and support is refusing to talk to me any longer because I have already asked too many questions. So I am here trying to get community help, and I am not sure what I should try next...
Domo stores a bunch of metadata about each dataflow run (queried in the sketch below this list):
- How long it takes to load each dataset (ms)
- How long it takes to run each transformation
- How long it takes to load the output to permanent storage
- How many rows are loaded
- Timestamps at each step
- How many rows are written
- Etc.
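If your instance exposes that run history as data (for example through a DomoStats-style history dataset, or any export of it), a query along these lines can surface the runs that deviate from the norm. The dataset and column names here are assumptions for illustration, not guaranteed names:

```sql
-- Hypothetical query over a dataflow run-history dataset.
-- "dataflow_history" and its columns are assumed names; adjust to your instance.
SELECT
    dataflow_name,
    begin_time,
    end_time,
    rows_input,
    rows_output
FROM dataflow_history
WHERE dataflow_name IN ('1a', '2a', '3a', '4a', '5a', '6a')
ORDER BY begin_time DESC
LIMIT 50;
```

Comparing the begin/end times and row counts of the stuck runs against the normal 8-12 second runs should show whether the stall happens while loading the inputs or during a transform step.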
I did some more research on the similar error I had this morning, and my delays were related to dataset load times. You get a hint of this when you see how many rows were loaded compared to other successful runs.
I can't say that my failure is the same as yours, but I'm starting to think it has to do with failures to finish loading data into the dataflow for processing. There's likely nothing you or I can do about the system not loading your data properly for processing, especially when it's intermittent like that, except to lobby for improved error handling and debugging.
Maybe someone out there in the Dojo has some other ideas.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
So far my datasets are dealing with a minimal number of rows added each time; I would say at most 10 rows are added to a dataset on each refresh. My SQL database updates every 10 minutes and Workbench every 30, meaning that if everything goes according to plan, 3 rows of data are added every 30 minutes. If the dataflow gets stuck and only updates every hour or hour and a half, then the additional rows are added on the next successful refresh. Ultimately my SQL DB is very small so far; I am not dealing with refreshing millions of rows, so it is very strange behaviour. In any event, I appreciate all the help you have provided so far!
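For reference, one way to confirm that cadence on the Postgres side is to bucket inserts into 30-minute windows. `my_table` and `created_at` below are placeholder names, assuming the table carries an insert timestamp:

```sql
-- Count rows added per 30-minute window in the source Postgres table.
-- "my_table" and "created_at" are placeholder names for illustration.
SELECT
    to_timestamp(floor(extract(epoch FROM created_at) / 1800) * 1800) AS window_start,
    count(*) AS rows_added
FROM my_table
WHERE created_at > now() - interval '24 hours'
GROUP BY 1
ORDER BY 1;
```

If the source really adds about 3 rows per window, rows_added should read 3 across the day, which would rule out data volume as the cause of the stalls.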