Optimizing a Domo Workflow for Sending Emails Based on Dataset Markers
I’m working on a project that involves sending emails to individuals based on specific markers in a dataset. Currently, I’ve set up a Domo workflow that iterates through the dataset row by row, checks certain conditions, and sends emails accordingly. While the workflow functions as intended, it is extremely slow.
To address this, I’ve tried cleaning up the dataset in an ETL to filter and prepare the data before the workflow starts. However, even with a pre-filtered dataset, the workflow still needs to iterate through each row to determine who to send the emails to and to personalize the content. This row-by-row processing seems to be the main bottleneck, and I’m looking for ways to optimize this process.
I’m curious to know how others in the community might approach this kind of project. Are there best practices for minimizing iteration time in workflows? Have you had success with bulk email sending or batching methods within Domo? Would integrating external tools or APIs to handle email sending, while still leveraging Domo datasets, be a better solution?
If you’ve tackled a similar problem or have any ideas, I’d greatly appreciate your insights. Thank you in advance for your help!
Best Answer
My first thought is: why not adjust the query so it only returns rows that you know you need to action? You can add a WHERE clause to the query.

SELECT * FROM DATA WHERE `Number_Column` = 1

Then you don't need to process through every row and check to see what the value is - you already know the value. You can also remove the "Get Number from Object" block by defining the existing object and evaluating its child. You can edit the "current_row" object (or whatever you have it called) to say it includes "number_column" (or whatever the column is) and then evaluate current_row.number_column. Eliminating that step will make it not only faster, but also less costly.
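To make the difference concrete, here is a rough Python sketch of the before/after logic. This is only an analogue of what the workflow blocks do, not actual Domo Workflows configuration; the query_dataset helper, the DATA table, and the Number_Column name are placeholders.

```python
# Hypothetical stand-in for the workflow's SQL step: runs a query and
# returns one dict per row of the dataset.
def query_dataset(sql: str) -> list[dict]:
    raise NotImplementedError("placeholder for the workflow's query block")

# Before: pull every row, then check each one with an extra lookup
# (the "Get Number from Object" step) to decide whether it needs action.
def rows_to_action_before() -> list[dict]:
    rows = query_dataset("SELECT * FROM DATA")
    return [row for row in rows if row["Number_Column"] == 1]

# After: push the filter into SQL, so every row that comes back is
# already known to need action and its fields can be read directly
# (the current_row.number_column style of access, with no extra block).
def rows_to_action_after() -> list[dict]:
    return query_dataset("SELECT * FROM DATA WHERE `Number_Column` = 1")
```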
Answers
You could do things concurrently: have one workflow that identifies all the rows, and have it kick off a second workflow that does the sending of the email.
If the first workflow identifies 100 emails to send, it will start 100 instances of the second workflow that all run concurrently.
That said, none of this should take very long. What type of processing time are you seeing, and how many rows are you processing/emails are you sending?
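Outside of Domo, that fan-out pattern looks roughly like the Python sketch below: one pass picks the rows, then each selected row is handed to a sender that runs concurrently with the others. The send_email function, the email field, and the Number_Column check are placeholders for illustration, not Domo APIs.

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder sender; in the Domo setup this would be the second
# workflow, started once per row that needs an email.
def send_email(row: dict) -> None:
    print(f"Sending to {row['email']}")

def fan_out(rows: list[dict]) -> None:
    # First stage: identify the rows that actually need an email.
    to_send = [row for row in rows if row.get("Number_Column") == 1]
    # Second stage: send them concurrently instead of one at a time.
    with ThreadPoolExecutor(max_workers=10) as pool:
        list(pool.map(send_email, to_send))  # consume to wait for all sends

fan_out([
    {"email": "a@example.com", "Number_Column": 1},
    {"email": "b@example.com", "Number_Column": 0},
])
```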
Hey Dan,
This is my first workflow, so I’m not entirely sure if I’m taking the right approach, but here’s how it’s currently set up:
Right now, the workflow isn’t configured to send emails. Its primary function at this stage is to loop through the dataset, evaluate a specific column’s value, and increment counters. This allows me to verify that I’m obtaining the correct results before enabling the email functionality.
Currently, the workflow processes about 430 rows of data in 12 minutes. However, the number of rows should drop significantly once the workflow is live and people start acting on the data.
Workflow "Pseudocode":
- Start workflow.
- Initialize variables (index & 4 counters).
- Query dataset with SQL (dataset is an ETL output).
- SELECT * FROM data
- Returns: array of objects (each row is an object).
- Loop through rows in the array:
- Get the current object/"row."
- Check the numerical value in one of the "columns" using the "Get Number from Object" block.
- Increment a counter based on the result.
- (Future step: Replace counters with "send email" blocks.)
- Check for more rows:
- If more rows exist (based on the index variable), repeat.
- If none, end the workflow.
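If it helps to see those steps as code, here is a rough Python equivalent of the loop above. The query_dataset helper, the Number_Column name, and the mapping from column values to the four counters are all made up for illustration; the real workflow is built from Domo blocks rather than code.

```python
# Hypothetical stand-in for the workflow's SQL step: returns one dict per row.
def query_dataset(sql: str) -> list[dict]:
    raise NotImplementedError("placeholder for the workflow's query block")

def count_markers() -> dict[int, int]:
    # Query dataset with SQL (the dataset is an ETL output).
    rows = query_dataset("SELECT * FROM data")
    # Initialize variables (index and 4 counters).
    counters = {1: 0, 2: 0, 3: 0, 4: 0}
    index = 0
    # Loop through rows in the array.
    while index < len(rows):
        current_row = rows[index]
        # "Get Number from Object": read the numeric column for this row.
        value = current_row["Number_Column"]
        # Increment a counter based on the result.
        if value in counters:
            counters[value] += 1
        # (Future step: replace the increment with a "send email" call.)
        index += 1  # check for more rows; repeat while any remain
    return counters
```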
I’d really appreciate any feedback on whether this approach makes sense or if there’s a better way to accomplish this. Let me know if you need more details!
So it sounds like what you want to do is send an email if a numerical value meets certain criteria? Is that correct?
And no worries at all! If this is your first workflow, you're doing great! There's a definite learning curve.
Hey Dan!
That is exactly what I started implementing after your response yesterday. I am using the "SQL Query Table" function to grab the rows I need, then iterating through them in 3 separate, parallel loops, which is a lot faster!
Thank you! :)