Comments
-
Correct, Magic ETL has network restrictions, with the exception of Writeback tiles. You'd need to import your data using a connector first and then process it with Magic ETL.
-
You'll need to reach out to Domo Support to have them possibly remove these.
-
@RocketMichael If those are your only records in your dataset, you can use Magic ETL to add a constant to your dataset (Join Column, value 1), then filter your data for just 'Current Week'. Take that filtered branch and join it back to your original dataset on the constant (roughly the SQL sketch below). This will join each…
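A minimal sketch of that constant-join pattern in SQL terms, assuming hypothetical names (original_dataset, Week Label, Date); in Magic ETL the same logic is built with Add Constants, Filter, and Join tiles:

```sql
-- Cross-join every original row with the 'Current Week' rows via a constant key.
-- All table/column names here are placeholders, not from the original thread.
SELECT o.*, cw.`Date` AS `Current Week Date`
FROM original_dataset o
JOIN (
    SELECT 1 AS `Join Column`, `Date`
    FROM original_dataset
    WHERE `Week Label` = 'Current Week'
) cw
    ON 1 = cw.`Join Column`
```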
-
How is your rank value being calculated in your dynamicRank table?
-
Currently it's not supported with a Date type variable. It'd be a great idea for the Idea Exchange.
-
Are you using a beast mode or an ETL to calculate your rank? Could you post the code?
-
Should be DAYOFMONTH
-
`Date` - INTERVAL (DAYOFMONTH(`Date`) - 1) DAY

This subtracts (day of month - 1) days, snapping `Date` back to the first day of its month (e.g. 2024-05-17 becomes 2024-05-01).
-
Have you tried using an Unpivot tile in an ETL to convert your columns to rows, then grouping by your metric column and counting the number of values? Roughly the pattern sketched below.
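A hedged SQL sketch of that unpivot-and-count approach; the table name (input) and metric columns (col_a, col_b) are placeholders, and in Magic ETL you'd use the Dynamic Unpivot and Group By tiles instead:

```sql
-- Turn each metric column into rows, then count non-null values per metric.
SELECT metric, COUNT(value) AS value_count
FROM (
    SELECT 'col_a' AS metric, `col_a` AS value FROM input
    UNION ALL
    SELECT 'col_b' AS metric, `col_b` AS value FROM input
) unpivoted
GROUP BY metric
```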
-
When you're filtering and including A and B, your LAG function will pull data from both metric A and metric B. You'll want to add PARTITION BY `metric` to your LAG statements to make sure the lag isn't jumping across metrics.
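For example (the `value` and `date` column names are assumptions, not from the original post):

```sql
LAG(`value`) OVER (PARTITION BY `metric` ORDER BY `date`)
```

With the PARTITION BY in place, each metric's lag only looks back at that same metric's prior row.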
-
If you go to the bottom of the post there's an updated version (under Addendum) of the JSON code you can copy and paste into the ETL canvas area to automatically populate the necessary tiles and logic. You'll just need to name the output dataset and select the correct input datasets. There are two…
-
You could use a Magic ETL 2.0 dataflow: group your data on start_date, model, and inventory_units and select the MAX ob_modified_date, then do an inner join between your original input dataset and the output of the Group By, matching on start_date, model, inventory_units, and ob_modified_date.
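A rough SQL equivalent of those Group By and Join tiles (the table name input is a placeholder):

```sql
-- Keep only the row carrying the latest ob_modified_date for each group.
SELECT i.*
FROM input i
JOIN (
    SELECT `start_date`, `model`, `inventory_units`,
           MAX(`ob_modified_date`) AS `ob_modified_date`
    FROM input
    GROUP BY `start_date`, `model`, `inventory_units`
) latest
    ON  i.`start_date`       = latest.`start_date`
    AND i.`model`            = latest.`model`
    AND i.`inventory_units`  = latest.`inventory_units`
    AND i.`ob_modified_date` = latest.`ob_modified_date`
```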
-
You can use the aggregation on your value field in the Single Value card and set it to Maximum.
-
pydomo/ds_update doesn't support append at this time. You could configure a recursive dataflow to append your data.
-
You can utilize the Java CLI and the backup-card command to get the card definition. You can also monitor the API calls being made by running the log-api command before backup-card to see which API endpoints it hits.
-
Hi @damen I'd recommend utilizing your own date dimension table to compare the same days per year. I've done a writeup on how to do this previously here: https://dojo.domo.com/main/discussion/53481/a-more-flexible-way-to-do-period-over-period-comparisons#latest
-
If you browse to the actual dataset you can see the specific ID in the URL in a GUID format. https://instance.domo.com/datasources/a12cbf64-35d0-47bb-8567-ce7c87149a54/details/overview
-
I'd recommend utilizing a date dimension dataset with different offsets so you have more control over your PoP values, and it will allow you to have multiple types of periods on the same graph. I've done a write-up on how to do this in the past here:…
-
No. If you can change the names so they're the same in the backend, then you won't need to create a new field / beast mode to do your filtering.
-
COUNT counts the number of non-null values, so a row is only counted when you return a value for it. You can use a CASE statement inside your COUNT to do this: COUNT(CASE WHEN `ptstatus` = 80 THEN `ptstatus` END)
-
The COUNT function counts the number of non-null records. If you're wanting to test whether a value is NULL, it works the same as in SQL: CASE WHEN `Name` IS NOT NULL THEN 'Not Null' ELSE 'Null' END
-
If you have Dataset A with Field1 and Dataset B with Field2, create a beast mode on the card using Dataset B and call it Field1, then have it simply return the value of Field2: `Field2`
-
You'll need to rename them so they have the same name. You could just save a beast mode to the dataset that simply returns the other column.
-
Your Calendly date field is being stored as a millisecond Unix timestamp. You can convert it to seconds by dividing by 1000 and then use FROM_UNIXTIME to convert it to an actual datetime type: FROM_UNIXTIME(`First Booked Calendly Date` / 1000)
-
You can use a Group By tile to select the MAX Date while grouping on your ID, then do an inner join where the Date and ID match to pull only the most recent record for each ID in your ETL.
-
I'd recommend reaching out to Domo Support as this appears to be a bug with the CLI tool.
-
You might be able to get a user list from https://SiteURL/_catalogs/users/detail.aspx, but you need to be an admin to access it. I found this article, which may provide more information: https://www.c-sharpcorner.com/blogs/sharepoint-rest-api-get-user-properties-and-user-information-list
-
You could have two datasets: one pulling in data from 31 days ago (update method set to Append) and another pulling in the last 30 days (set to Replace), then use an ETL to append the two datasets together (roughly the SQL sketch below).
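A rough SQL sketch of that append; the dataset names are placeholders, and in Magic ETL this is just the Append Rows tile:

```sql
-- Combine the accumulating history with the refreshed 30-day window.
SELECT * FROM history_day_31   -- connector update method: Append
UNION ALL
SELECT * FROM last_30_days     -- connector update method: Replace
```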
-
Are you referring to creating these on developer.domo.com? Those don't have Domo grants applied to them. To use an API account, the account would need to either be created by the user (so they're the owner) or be shared with them. If they want to create a new account and one doesn't exist, they'd need the Create…
-
Currently this isn't possible; however, I'd recommend adding it to the Idea Exchange to allow more flexibility with drill paths.