Run of DataFlow Times Out

SLamba
Member
in Connectors
I've got a dataflow that times out whenever I try to run it. The error message says "Cache file timed out", and it appears to happen somewhere around 10-12M rows. I'm filtering one of the tables to reduce the number of records to process, but the time-out occurs before the flow ever reaches the filter step. Any suggestions?
Best Answer
-
Thanks for replying, but I figured out a workaround. I just switched the order of the tables in my join, and that sped things up.
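
For anyone hitting the same wall, the idea can be sketched in a SQL dataflow. This is a minimal, hypothetical example (the table and column names are made up, not from the original post): filter the large table before the join, and drive the join from the smaller table, so the transform caches far fewer rows.

```sql
-- Hypothetical sketch: push the filter below the join so the large
-- table is reduced before rows are cached, and put the smaller table
-- on the driving side of the join.
SELECT s.order_id,
       s.amount,
       f.region
FROM small_table s              -- smaller table first
JOIN (
    SELECT order_id, region
    FROM large_table
    WHERE region = 'EMEA'       -- filter early, before the join
) f
  ON f.order_id = s.order_id
```

The same principle applies in Magic ETL: place the Filter tile upstream of the Join tile rather than after it.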