Workbench Job Runtime
Does anyone have tips or best practices for improving runtime on Workbench jobs? We have a job that updates almost 82 million rows of data daily and takes about 2 to 3 hours to run, varying day to day. I do not have access to Workbench since it is controlled by our IT team, but I am looking for tips I could share with them to help improve the runtime of this job. The job was created several years ago, so I am sure some changes could be made to make it more efficient. I am just looking for ideas (quick wins, or perhaps features of newer Workbench releases) that you have come across that usually help runtime.
Answers
@Ashleigh - What type of source is the data being pulled from? A database?
@GrantSmith yeah, I think so; it's one of our financial reporting systems.
Typically when dealing with databases you'll want to narrow down your data as early as possible to limit the amount of data being processed. Also, make sure you're utilizing table indexes in your join and WHERE clauses to improve query response time. If the tables don't have any, they might want to think about adding some, but that's a larger discussion with your DBA.
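To make that concrete, here is a minimal sketch (SQL Server syntax; the table and column names are hypothetical placeholders, not from the original post). It narrows the data in the query itself and indexes the filter column so the database can seek instead of scan:

```sql
-- Hypothetical names throughout (fin_transactions, posted_date, etc.).
-- Narrow the result set in the query instead of pulling every row:
SELECT t.transaction_id, t.account_id, t.amount, t.posted_date
FROM   fin_transactions AS t
WHERE  t.posted_date >= DATEADD(day, -1, CAST(GETDATE() AS date));

-- An index on the filter/join column lets the database seek instead of
-- scanning all ~82M rows (coordinate this with your DBA first):
CREATE INDEX ix_fin_transactions_posted_date
    ON fin_transactions (posted_date);
```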
@Ashleigh Workbench 5.1 introduced partition support, which might be effective for your use case. You can read about it here:
https://domohelp.domo.com/hc/en-us/articles/360062446514-Workbench-5-1-Partition-Support
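As a hedged illustration of how partitioning changes the workload (the table and column names below are invented for the example): you pick a partition column such as a date, and the Workbench query only needs to return the partitions that changed, so Domo swaps those partitions rather than reloading all 82 million rows.

```sql
-- Hypothetical sketch for a partitioned Workbench job. posted_date acts as
-- the partition column; the query returns only the partitions that changed,
-- and Workbench replaces just those partitions in the DataSet.
SELECT transaction_id, account_id, amount, posted_date
FROM   fin_transactions
WHERE  posted_date >= DATEADD(day, -3, CAST(GETDATE() AS date));  -- last 3 days
```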
@Ashleigh, @MarkSnodgrass and @GrantSmith are on the money, for very different reasons.
1) As your query gets larger, Workbench execution times are 99% subject to the performance of your source database (how long it takes to retrieve your data from SQL). If you're parsing 82 million rows on your source database, you are murdering your system if it's not properly indexed. Work with your DBA to optimize your Workbench query in the source database via indexes and database statistics, and make sure you're not doing SELECT * because no DBA likes seeing that ...
2) Mark nailed it on the head. If you can minimize how much data you have to update in Domo (via partitioning, or UPSERT instead of a full REPLACE), then you can shrink how much data you have to pull out of your source database; see the sketch after this list.
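Here is a rough sketch of the incremental idea in point 2 (again with invented table and column names, and @last_run_time standing in for however your job tracks its previous run). Only rows modified since the last run are pulled, and Workbench UPSERTs them against a key column instead of replacing the full DataSet:

```sql
-- Hypothetical incremental pull for an UPSERT job. @last_run_time is a
-- placeholder supplied by the job; transaction_id would be the UPSERT key.
SELECT transaction_id, account_id, amount, posted_date, last_modified
FROM   fin_transactions
WHERE  last_modified >= @last_run_time;

-- Keep the incremental filter cheap with an index on the watermark column:
CREATE INDEX ix_fin_transactions_last_modified
    ON fin_transactions (last_modified);
```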
Jae Wilson