Run of DataFlow Times Out
SLamba (Member) in Connectors
I've got a dataflow that times out whenever I try to run it. The error message says "Cache file timed out", and it seems to happen somewhere around 10-12M rows. I'm filtering one of the tables to reduce the number of records to process, but the time-out error occurs before it gets to the filter step. Any suggestions?
Best Answer
-
Thanks for replying, but I figured out a workaround: I just switched the order of the tables in my join, and that sped things up.
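For anyone hitting the same wall: the reason a join-order swap (or moving a filter earlier) helps is that far fewer rows have to flow through the expensive join step. Here is a minimal, generic Python sketch of that idea — it is not Domo-specific, and the table names and values are made up for illustration:

```python
# Hypothetical illustration: filter the large input *before* joining,
# so the join only ever touches the rows you actually need.

# A large fact table (stand-in for the 10-12M row dataset).
large = [{"id": i, "value": i * 2} for i in range(1_000_000)]

# A small lookup table keyed by id.
small = {101: "A", 202: "B"}

# Filter-first: keep only rows whose key exists in the small table...
filtered = [row for row in large if row["id"] in small]

# ...then perform the join on the tiny filtered set.
joined = [{**row, "label": small[row["id"]]} for row in filtered]

print(len(joined))  # 2 rows pass through the join, not 1,000,000
```

In SQL DataFlow terms, the same principle applies: pre-filter or pre-aggregate the big table in its own step (or put the smaller table on the driving side of the join) before the join runs, so the cache never has to hold the full unfiltered row count.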