Best Of
Re: Any way to view schedule for dataflows?
import os
import sys
import json
import requests
from getpass import getpass

DOMO_INSTANCE = os.getenv("DOMO_INSTANCE", "YOUR_INSTANCE")  # <-- set this or export DOMO_INSTANCE
DATAFLOW_ID = "PUT_YOUR_DATAFLOW_ID_HERE"  # <-- replace with a real dataflow id (uuid-ish string)

# Token: either set the DOMO_DEV_TOKEN env var or leave it unset to be prompted securely at runtime
DOMO_DEV_TOKEN = os.getenv("DOMO_DEV_TOKEN") or getpass("Domo developer token: ")

def list_dataflow_ids(instance, dev_token):
    url = f"https://{instance}.domo.com/api/dataprocessing/v1/dataflows"
    headers = {"x-domo-developer-token": dev_token, "Accept": "application/json"}
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    batch = resp.json()
    return [df["id"] for df in batch if "id" in df]

def fetch_dataflow_preview(instance, df_id, dev_token):
    url = f"https://{instance}.domo.com/api/dataprocessing/v2/dataflows/{df_id}"
    headers = {"x-domo-developer-token": dev_token, "Accept": "application/json"}
    params = {"validationType": "PREVIEW"}
    resp = requests.get(url, headers=headers, params=params)
    resp.raise_for_status()
    return resp.json()

# -----------------------------
# Main
# -----------------------------
dataflow_ids = list_dataflow_ids(DOMO_INSTANCE, DOMO_DEV_TOKEN)
print(f"Found {len(dataflow_ids)} dataflow IDs\n")
sys.stdout.flush()

dataflows = []
for idx, df_id in enumerate(dataflow_ids, start=1):
    try:
        preview = fetch_dataflow_preview(DOMO_INSTANCE, df_id, DOMO_DEV_TOKEN)
        name = preview.get("displayName") or preview.get("name") or "<no name>"
        dataflows.append({"id": df_id, "name": name})
        print(f"[{idx}/{len(dataflow_ids)}] {df_id} -> {name}")
        sys.stdout.flush()  # force print in Jupyter as it runs
    except Exception as e:
        print(f"[{idx}/{len(dataflow_ids)}] Error fetching preview for {df_id}: {e}")
        sys.stdout.flush()
This Python code lists the dataflow IDs.
Re: Execution List Impossible To Investigate Issues
This sounds like a feature request rather than a question. Is the following a correct summary of it? If so, maybe a forum admin could move it:
Feature Request: Improve Filtering and Visibility in Execution List
Summary
Currently, the Execution List in Domo does not allow filtering or searching by recipient (e.g., email address or person). This makes it very difficult to investigate issues with automated processes that send emails or require responses.
Use Case
- We run an automated process that sends weekly emails requesting form responses.
- We have 300+ recipients at a time.
- When investigating user-reported issues (such as someone receiving a reminder despite having completed the form), we cannot:
- Search/filter executions by recipient or variable (e.g., email).
- See at a glance who received an email without drilling into each execution.
Problem
- A recipient received reminder emails despite evidence they had already completed the form.
- The form was no longer in the queue, but we could not determine which execution sent the email because the Execution List has no filtering options.
- Troubleshooting requires manually opening each execution, which is inefficient and impractical at scale.
Requested Enhancement
Please add the following capabilities to the Execution List:
- Filtering/searching by recipient (person or email address).
- Filtering by variable values (e.g., user ID, form name, workflow inputs).
- Quick visibility into recipients of each execution without needing to click into each one.
Is there a way to set variables on an embedded page, like passing pfilters?
The idea is to render the iframe of the page with a variable value already set via the src URL, similar to passing a pfilter, so that when the page loads, the variable already has the passed value.
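For comparison, pfilters themselves are passed as URL-encoded JSON in the embed URL's query string. A minimal sketch of building such a src (the embed ID and column name are placeholders; note this covers pfilters only, not variables, which is exactly the gap the question asks about):

```python
import json
from urllib.parse import quote

# Placeholder embed URL and filter definition.
EMBED_URL = "https://public.domo.com/embed/pages/YOUR_EMBED_ID"
pfilters = [{"column": "Region", "operand": "IN", "values": ["West"]}]

# URL-encode the JSON filter list and append it as the pfilters parameter.
src = f"{EMBED_URL}?pfilters={quote(json.dumps(pfilters))}"
iframe = f'<iframe src="{src}" width="100%" height="600"></iframe>'
print(src)
```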
Re: App Studio Form Output for Multi-select
I checked and the max number of selections so far has been 80.
I did add a pre-processing step in the morning that handles all old entries. This allows my main ETL to only need to import form entries from today. ETL time has dropped from 40-45 seconds to 18-20.
This is better, but still not as performant as using Views. Using Views has the added benefit of Syncing differently and allowing users to maintain their filters.
Re: Wrong alert
Thanks @ColemenWilson and @ArborRose.
We opened a support case, and it turned out that this does not work with View datasets. So we changed to dataflow output datasets and it works. It is mentioned in the documentation, but we overlooked it.
Thanks
Re: Using Query Result as Email Body
Thanks for the suggestions, vtiwari!
The Workflow looks pretty different now, but here are the biggest fixes that helped me:
- Moved the loop. The conditional logic needed to be before Filter By User (getObjectFromList) to correctly get one object from the list of objects.
- Added Child Properties to individualUsage that matched lastMonth. The SQL statement pulls a list of objects (a bunch of rows; lastMonth), then the filter pulls one object (one row; individualUsage) at a time. I added Child Properties to individualUsage that exactly matched the names of the Child Properties of lastMonth. This allowed the data to properly map to individualUsage and show up in the bodies of each email.
- (Optional) Removed Create Index. I created a variable called Index and set the initial value to 0 so that the Create Index tile could be removed.
Re: Splitting Appointments
Here's a breakdown of the way I was thinking. Apologies, my docking unit just died and I have no monitors; I'm not used to using a laptop screen… lol.
Appointments:
AppointmentID,AppointmentDate,Provider,Metric
1,2024-06-15,Mitch,1
2,2024-07-20,Justin,1
3,2024-08-05,Kristin,1
4,2025-01-10,Kristin,1
5,2025-02-22,Kristin,1
6,2025-03-15,Justin,1
7,2025-04-12,Mitch,1
8,2025-05-30,Kristin,1

Split mapping:
Provider,Galley,SplitPct,StartDate,EndDate
Mitch,John,1,1900-01-01,9999-12-31
Justin,John,1,1900-01-01,9999-12-31
Kristin,Tanner,1,1900-01-01,2024-12-31
Kristin,Tanner,0.5,2025-01-01,9999-12-31
Kristin,Brent,0.5,2025-01-01,9999-12-31
Using a Magic ETL, join the mapping to the appointments over time and apply a formula to adjust the split (sorry, my naming is not good). You would come out with a result similar to Colemen's comments.
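That join-and-adjust step can be sketched like this (pandas here, only to illustrate the logic; the table and column names follow the sample data above, trimmed to two appointments). Dates stay as ISO strings so the open-ended 9999-12-31 sentinel compares correctly:

```python
import pandas as pd

# Toy versions of the two tables above (subset of rows).
appointments = pd.DataFrame({
    "AppointmentID": [3, 4],
    "AppointmentDate": ["2024-08-05", "2025-01-10"],
    "Provider": ["Kristin", "Kristin"],
    "Metric": [1, 1],
})
splits = pd.DataFrame({
    "Provider": ["Kristin", "Kristin", "Kristin"],
    "Galley": ["Tanner", "Tanner", "Brent"],
    "SplitPct": [1.0, 0.5, 0.5],
    "StartDate": ["1900-01-01", "2025-01-01", "2025-01-01"],
    "EndDate": ["2024-12-31", "9999-12-31", "9999-12-31"],
})

# Join on Provider, then keep only mapping rows whose effective window
# contains the appointment date -- the Magic ETL join + filter step.
merged = appointments.merge(splits, on="Provider")
merged = merged[
    (merged["AppointmentDate"] >= merged["StartDate"])
    & (merged["AppointmentDate"] <= merged["EndDate"])
]
# Apply the split: each galley gets its share of the appointment metric.
merged["SplitMetric"] = merged["Metric"] * merged["SplitPct"]
print(merged[["AppointmentID", "Galley", "SplitMetric"]])
```

Appointment 3 (before the split took effect) stays whole with Tanner; appointment 4 lands as two 0.5 rows, one for Tanner and one for Brent.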
Re: Splitting Appointments
Hey @Data_Devon congrats again on the MajorDomo cert!
How I would approach this is to add a 3rd table or some logic to your data. The 3rd table would be an allocation table that would maintain split allocation, effective dates, etc… You could then join this table to have fields that would tell you:
1. If there is a split for this Provider
2. If the appointment falls within the split effective period
3. What the split allocation is
You could alternatively accomplish this through logic in a formula tile in your ETL, but this may be a pain to maintain. The benefit of the Allocation table is that it could be maintained by a sales ops or other operations person at your company AND you'd have a detailed source of truth and history of allocations.
Your data might end up looking something like this:
| Appointment | Provider | Galley | Split? | Split Allocation |
|---|---|---|---|---|
| Eval | Mitch | John | FALSE | 1 |
| Eval | Justin | John | FALSE | 1 |
| Eval | Kristin | Tanner | TRUE | 0.5 |
| Eval | Kristin | Tanner | TRUE | 0.5 |
| Eval | Kristin | Brent | TRUE | 0.5 |
| Eval | Kristin | Brent | TRUE | 0.5 |
Re: How to periodically REPLACE in addition to APPEND jobs?
@JunkDoom have a close look at Partitioning.
Many Domo connectors are starting to support partition schemes based on Date. If your data is not partitioned by Date you can use the CLI to create your own partition tags as you upload data.
The in-Domo ETL version of "Partitioning" can be recreated with Recursive Dataflows: https://domohelp.domo.com/hc/en-us/articles/360057087393-Creating-a-Recursive-Snapshot-DataFlow-in-Magic-ETL
There is a beta (or is it GA @GrantSmith ) for Magic ETL + Partitions.
You want to be careful mixing Partition with Replace. Replace will replace your entire dataset WITHOUT partition tags, meaning that if you tried to mix Replace with Append+Partition you would duplicate data.
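A toy in-memory model (not Domo's API; the function names are made up) of why that duplication happens — Replace wipes the dataset and loads untagged rows, so a later partition upload can no longer find and drop the old copies:

```python
# Toy model of dataset update semantics; rows are (partition_tag, value).
dataset = []

def append_partition(rows, tag):
    # Partition upsert: drop existing rows with this tag, then add new ones.
    global dataset
    dataset = [r for r in dataset if r[0] != tag] + [(tag, v) for v in rows]

def replace_all(rows):
    # Replace: wipes everything and loads rows with NO partition tags.
    global dataset
    dataset = [(None, v) for v in rows]

append_partition([1, 2], "2024-01-01")   # 2 tagged rows
replace_all([1, 2])                      # same 2 rows, but now untagged
append_partition([1, 2], "2024-01-01")   # can't drop the untagged copies
print(len(dataset))                      # 4: the data is duplicated
```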




