Run of DataFlow Times Out
SLamba
Member
in Connectors
I've got a dataflow that times out whenever I try to run it. The error message says "Cache file timed out", and it seems to happen somewhere around 10-12M rows. I'm filtering one of the tables to reduce the number of records to process, but the timeout error occurs before it gets to the filter step. Any suggestions?
Best Answer
Thanks for replying, but I figured out a workaround. I just switched the order of the tables in my join, and that sped things up.
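For anyone hitting the same wall: another common fix is to push the filter into a subquery so it runs before the join, which keeps the row count down before the expensive step. This is only a sketch; the table and column names below are made up for illustration, not from the original dataflow.

```sql
-- Hypothetical SQL DataFlow transform: filter the large table
-- inside a subquery so the join only ever sees the reduced set.
SELECT s.store_id,
       s.amount,
       d.region
FROM (
    SELECT store_id, amount, order_date
    FROM sales_raw                       -- the large (~10-12M row) table
    WHERE order_date >= '2023-01-01'     -- pre-filter before joining
) s
JOIN dim_store d
  ON s.store_id = d.store_id
```

Combined with putting the smaller table first in the join, this usually cuts the runtime enough to stay under the cache timeout.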