Datasets stuck refreshing
1. I have a PostgreSQL database that I am uploading to Domo via Workbench as 6 different ODBC datasets (let's call them 1, 2, 3, 4, 5, 6). Each dataset is updated every 30 minutes from Workbench - these work like clockwork without a problem.
2. Each dataset is then modified in ETL with some calculations and forms, in turn, 1a, 2a, 3a, 4a, 5a, 6a. These dataflows are triggered automatically by the ODBC refreshes every 30 minutes. Normally an update takes 8-12 seconds, but these dataflows regularly get stuck on "running" and stay "running" for an hour or so, then cancel and restart. I would like to be able to troubleshoot and/or understand what causes them to get stuck like that, because in turn it causes the following dataflows to get stuck as well...
3. 1a, 2a, 3a, 4a, 5a, 6a are then combined into two separate datasets with the "join" command and other calculations (let's call them X and Y). X and Y can be triggered by an update in any of the 1a, 2a, 3a, 4a, 5a, 6a datasets. But the same thing happens: they get stuck for an hour without refreshing. Most likely this is caused by 1a, 2a, 3a, 4a, 5a, 6a getting stuck, but I am asking here just to confirm.
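To make the dependency explicit, here is a toy sketch of the chain as I understand it. Nothing here is Domo-specific - the names, timings, and the "2x cadence = stuck" rule are purely illustrative:

```python
# Toy model of the trigger chain above; names, timings, and the
# staleness rule are illustrative only, not anything Domo exposes.
from datetime import datetime, timedelta

CADENCE = timedelta(minutes=30)
INPUTS = ["1a", "2a", "3a", "4a", "5a", "6a"]
DEPENDS_ON = {"X": INPUTS, "Y": INPUTS}

# pretend 3a hung for ~90 minutes; everything else refreshed on time
last_refresh = {name: datetime.now() - CADENCE for name in INPUTS}
last_refresh["3a"] = datetime.now() - timedelta(minutes=90)

for output, inputs in DEPENDS_ON.items():
    stale = [i for i in inputs if datetime.now() - last_refresh[i] > 2 * CADENCE]
    if stale:
        # X/Y may still trigger off a fresh input, but their data is
        # stale (or their run blocked) with respect to the stuck one
        print(f"{output} is running against stale input(s): {stale}")
```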
Any suggestions would be very welcome!
Thank you!
Comments
-
It sounds like an action in your ETL is just getting hung up, after which the process resets.
Have you viewed the dataflow History tab, which shows what happened at each step? This screen is helpful for seeing where breakdowns happen and why. In the last column, Status, the green or red Success or Failed icon is a link. Clicking it will take you to the status of each step in the dataflow. Hover over each step's status to find the tooltip error message. My guess is it's on a join step.
Let us know how it goes.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
-
The problem is I don't get failures. What happens is that the ETL runs for an hour or so (50 min to 1 hr 20 min), then cancels and auto-restarts, and in most cases it then goes right through automatically - but sometimes it can hang for an hour again. If I click on details, nothing shows, because it did not "fail", it auto-cancelled...
-
That's frustrating. And running dataflows don't show that detail.
One of my dataflows just now had the same problem, and I can't tell either. Usually it runs in a few minutes, but this one auto-cancelled at 45 minutes.
This almost sounds like a processing engine problem. Like a system hiccup, especially if those dataflows are all the same version. I don't think users have the details available to diagnose system problems like that.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
-
Well, my problem is I am using a free account for now, and Support is refusing to talk to me any longer because I have already asked too many questions... hence I am here trying to get community help... so I'm not sure what I should try next...
-
Domo stores a bunch of metadata about each dataflow run:
- How long it takes to load each dataset (ms)
- How long it takes to run each transformation
- How long it takes to load the output to permanent storage
- How many rows are loaded
- Timestamps at each step
- How many rows are written
- Etc.
I did some more research on the similar error that I had this morning, and my delays were related to dataset load times. You get a hint of this when you compare how many rows were loaded against the other successful runs.
I can't say that my failure is the same as yours, but I'm starting to think it has to do with failures to finish loading data into the dataflow for processing. There's likely nothing you or I can do about the system not loading your data properly for processing, especially when it's intermittent like that, except to lobby for improved error handling and debugging.
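If you want to watch for those stalls yourself from the outside, here is a minimal sketch using Domo's public DataSet API (developer.domo.com). It only reads each dataset's row count and last-updated timestamp rather than the internal step timings, so take it as an approximation; the client credentials and dataset IDs are placeholders you would fill in:

```python
# A rough watcher using Domo's public DataSet API. CLIENT_ID,
# CLIENT_SECRET, and the dataset IDs are placeholders; rows and
# updatedAt come from the documented DataSet response.
import time
import requests

CLIENT_ID = "your-client-id"
CLIENT_SECRET = "your-client-secret"
DATASET_IDS = ["id-of-1a", "id-of-2a"]  # ...add 3a-6a, X, Y

def get_token():
    # client-credentials grant; the 'data' scope covers dataset reads
    resp = requests.get(
        "https://api.domo.com/oauth/token",
        params={"grant_type": "client_credentials", "scope": "data"},
        auth=(CLIENT_ID, CLIENT_SECRET),
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

while True:
    token = get_token()
    for ds_id in DATASET_IDS:
        meta = requests.get(
            f"https://api.domo.com/v1/datasets/{ds_id}",
            headers={"Authorization": f"Bearer {token}"},
        ).json()
        # an updatedAt that stops advancing past the 30-minute cadence
        # is the stuck dataflow; log with a timestamp for comparison
        print(time.strftime("%H:%M"), meta["name"], meta["rows"], meta["updatedAt"])
    time.sleep(300)  # poll every 5 minutes
```

If updatedAt on one of the 1a-6a outputs stops advancing while its input dataset keeps moving, that pinpoints which dataflow is hanging.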
Maybe someone out there in the Dojo has some other ideas.
Aaron
MajorDomo @ Merit Medical
**Say "Thanks" by clicking the heart in the post that helped you.
**Please mark the post that solves your problem by clicking on "Accept as Solution"
-
So far my datasets are dealing with a minimal number of rows added each time - I would say at most 10 rows are added to a dataset on each refresh. My SQL database updates every 10 minutes and Workbench every 30, meaning that if everything goes according to plan, 3 rows of data are added every 30 minutes; if the dataflow gets stuck and only updates every hour or hour and a half, the extra rows are simply picked up on the next successful refresh. Ultimately my SQL DB is very small so far - I am not dealing with refreshing millions of rows - so it is very strange behaviour. In any event, I appreciate all the help you have provided so far!
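For anyone hitting the same thing, a quick way to sanity-check the source side is to count how many rows were actually added per 30-minute window in Postgres and compare that against what each Domo refresh loaded. A rough sketch - the connection string, table, and column names here are made up, so substitute your own:

```python
# Count rows added per 30-minute window in Postgres, to compare
# against what each Domo refresh actually loaded. 'my_table' and
# 'created_at' are placeholders for a table with a row timestamp.
import psycopg2

conn = psycopg2.connect("dbname=mydb user=me password=secret host=localhost")
with conn.cursor() as cur:
    cur.execute("""
        SELECT date_trunc('hour', created_at)
                 + INTERVAL '30 min' * FLOOR(EXTRACT(minute FROM created_at) / 30)
                 AS bucket,
               COUNT(*) AS new_rows
        FROM my_table
        WHERE created_at > now() - INTERVAL '1 day'
        GROUP BY bucket
        ORDER BY bucket;
    """)
    for bucket, new_rows in cur.fetchall():
        print(bucket, new_rows)
conn.close()
```

If the Postgres buckets show the handful of rows you expect while Domo's run history shows a load stalling, that points back at the platform side rather than the data.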