Run of DataFlow Times Out

SLamba
Member
in Connectors
I've got a dataflow that times out whenever I try to run it. The error message says "Cache file timed out", and it seems to happen somewhere around 10-12M rows. I'm filtering one of the tables to reduce the number of records to process, but the timeout error occurs before it gets to the filter step. Any suggestions?
Best Answer
- Thanks for replying, but I figured out a workaround: I switched the order of the tables in my join, and that sped things up.
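For anyone hitting the same timeout: since the error occurred before the filter step ever ran, one general approach is to reorder the join and push the filter inside it, so the smaller row set drives the join instead of the full 10-12M rows. A minimal sketch, assuming hypothetical table names `large_events` and `small_lookup` and an illustrative filter column `region`:

```sql
-- Pre-filter the large table in a subquery so the join
-- processes far fewer rows. Table and column names here
-- are illustrative, not from the original dataflow.
SELECT e.event_id,
       e.region,
       l.label
FROM (
    SELECT event_id, region, lookup_id
    FROM large_events
    WHERE region = 'EMEA'      -- filter BEFORE the join, not after
) e
JOIN small_lookup l
  ON l.id = e.lookup_id
```

The idea matches the workaround above: whichever table is read first (and how much of it survives filtering) determines the size of the intermediate result the engine has to cache.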