Comments
-
If you go to the bottom of the post there's an updated version (under Addendum) of JSON code you can copy and then paste into the ETL canvas area to automatically populate the necessary tiles and logic. You'll just need to name the output dataset and select the correct datasets for the input datasets. There are two…
-
You could use a Magic ETL 2.0 dataflow: group your data on start_date, model, and inventory_units and select the MAX ob_modified_date, then do an inner join between your original input dataset and the output of the group by, matching on start_date, model, inventory_units, and ob_modified_date.
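For a rough sense of what those tiles do, here's a pandas sketch of the same group-by-MAX-then-inner-join pattern. The column names come from the answer above; the sample rows are made up:

```python
import pandas as pd

# Hypothetical sample rows; column names taken from the answer above.
df = pd.DataFrame({
    "start_date":       ["2024-01-01", "2024-01-01", "2024-01-02"],
    "model":            ["A", "A", "B"],
    "inventory_units":  [10, 10, 5],
    "ob_modified_date": ["2024-01-03", "2024-01-05", "2024-01-04"],
})

# Group By tile: take the MAX ob_modified_date per key.
latest = (df.groupby(["start_date", "model", "inventory_units"], as_index=False)
            ["ob_modified_date"].max())

# Inner join back to the original input on all four columns,
# keeping only the most recently modified row per key.
result = df.merge(latest,
                  on=["start_date", "model", "inventory_units", "ob_modified_date"],
                  how="inner")
```

The join on ob_modified_date is what drops the stale duplicates; only the row carrying the MAX date per key survives.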
-
You can use the aggregation on your value field in the Single value card and set it to maximum:
-
pydomo/ds_update doesn't support append at this time. You could configure a recursive dataflow to append your data.
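As a sketch of the recursive-dataflow idea (shown in pandas rather than Magic ETL; the dataset names and rows are made up), the dataflow reads its own previous output back in as an input and appends the new rows to it:

```python
import pandas as pd

# Hypothetical data: the dataflow's previous output, read back in as an input.
existing = pd.DataFrame({"id": [1, 2], "value": ["a", "b"]})

# Latest load from the source dataset.
new_rows = pd.DataFrame({"id": [3], "value": ["c"]})

# Append instead of replace: the combined frame becomes the new output,
# which is fed back in as an input on the next run.
appended = pd.concat([existing, new_rows], ignore_index=True)
```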
-
You can utilize the Java CLI and the backup-card command to get the card definition. You can also run the log-api command before backup-card to monitor which API endpoints it's calling.
-
Hi @damen I'd recommend utilizing your own date dimension table to compare the same days per year. I've done a writeup on how to do this previously here: https://dojo.domo.com/main/discussion/53481/a-more-flexible-way-to-do-period-over-period-comparisons#latest
-
If you browse to the actual dataset you can see the specific ID in the URL in a GUID format. https://instance.domo.com/datasources/a12cbf64-35d0-47bb-8567-ce7c87149a54/details/overview
-
I'd recommend utilizing a date dimension dataset with different offsets so you have more control over your PoP values, and it will allow you to have multiple types of periods on the same graph. I've done a write-up on how to do this in the past here:…
-
No. If you can change the names so they're the same in the backend, then you won't need to create a new field / beast mode to do your filtering.
-
COUNT counts the number of non-null values, so a row is only counted when the expression returns a value. You can use a CASE statement inside your COUNT to do this: COUNT(CASE WHEN `ptstatus` = 80 THEN `ptstatus` END)
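The mechanics can be sketched in plain Python (the sample rows are hypothetical; None plays the role of SQL NULL):

```python
# Hypothetical rows; None stands in for SQL NULL.
rows = [{"ptstatus": 80}, {"ptstatus": 75}, {"ptstatus": 80}, {"ptstatus": None}]

# COUNT(CASE WHEN `ptstatus` = 80 THEN `ptstatus` END):
# the CASE yields NULL for non-matching rows, and COUNT skips NULLs,
# so only the matching rows are counted.
conditional_count = sum(1 for r in rows if r["ptstatus"] == 80)

# Plain COUNT(`ptstatus`) counts every non-NULL value.
plain_count = sum(1 for r in rows if r["ptstatus"] is not None)
```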
-
The COUNT function counts the number of non-null records. If you're wanting to test whether a field is NOT NULL, it's the same as in SQL: CASE WHEN `Name` IS NOT NULL THEN 'Not Null' ELSE 'Null' END
-
If you have Dataset A with Field1 and Dataset B with Field2, create a beast mode on the card using Dataset B and call it Field1. Then have it return the value of Field2: `Field2`
-
You'll need to rename them so they have the same name. You could just save a beast mode to the dataset that simply returns the other column.
-
Your Calendly date field is being stored as a millisecond Unix timestamp. You can convert it to seconds by dividing by 1000 and then use FROM_UNIXTIME to convert it to an actual datetime type: FROM_UNIXTIME(`First Booked Calendly Date` / 1000)
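The same conversion in Python, with a hypothetical millisecond timestamp value:

```python
from datetime import datetime, timezone

# Hypothetical millisecond Unix timestamp, as Calendly stores it.
first_booked_ms = 1700000000000

# Divide by 1000 to get seconds, then convert to an actual datetime,
# mirroring FROM_UNIXTIME(`First Booked Calendly Date` / 1000).
first_booked = datetime.fromtimestamp(first_booked_ms / 1000, tz=timezone.utc)
```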
-
You can use a group by tile to select the MAX Date grouping on your ID then do an inner join where the Date and ID match to get the most recent records in your ETL.
-
I'd recommend reaching out to Domo Support as this appears to be a bug with the CLI tool.
-
You might be able to get a user list from https://SiteURL/_catalogs/users/detail.aspx but you need to be an admin to access it. I found this on the internet which may provide more information: https://www.c-sharpcorner.com/blogs/sharepoint-rest-api-get-user-properties-and-user-information-list
-
You could have two datasets, one to pull in 31 days ago (set to append) and another to pull in the last 30 days (set to replace) and then use an ETL to append both of the datasets together.
-
Are you referencing creation of these on developer.domo.com? These don't have the Domo Grants applied to them. To use an API account the account would need to be either created by the user so they're the owner or have it shared with them. If they want to create a new account and one doesn't exist they'd need the Create…
-
Currently this isn't possible; however, I'd recommend adding this to the Idea Exchange to allow more flexibility with drill paths.
-
You might be able to leverage pfilters to pass in your filters in your URL to recreate the filtered view you're looking at. https://domo-support.domo.com/s/article/360042933114?language=en_US
-
Try this: REGEXP_REPLACE(`string`, '^.*(\d{2}-[A-Za-z]{3}-\d{4}).*$', '$1'). You need to replace the entire string, not just the substring; that's why the ^.* and .*$ are there, to match anything before and after the date value respectively. (Note: [A-Za-z] is safer than [A-z], which also matches a few punctuation characters that sit between Z and a.)
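The same extraction in Python's re module, on a made-up input string; Python uses \1 for the backreference where Domo's REGEXP_REPLACE uses $1:

```python
import re

# Hypothetical input string containing one dd-Mon-yyyy date.
s = "Invoice generated on 05-Mar-2024 for account 1234"

# Replace the entire string with just the captured date,
# same idea as the REGEXP_REPLACE above.
date_only = re.sub(r'^.*(\d{2}-[A-Za-z]{3}-\d{4}).*$', r'\1', s)
```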
-
You could also utilize a regular expression to modify your string and automatically insert the HTML code. Something like this might work assuming your values are actual colors (untested) REGEXP_REPLACE(`RAG History`, '.*((Yellow)|(Red)|(Green)).*', '<div><span style="color: $1">$1</span></div>')
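A quick check of that pattern in Python (the RAG History value below is made up, and assumes the colour word actually appears in the text; Python uses \1 where Domo uses $1):

```python
import re

# Hypothetical RAG History value containing one colour word.
rag_history = "Week 12 status: Red"

# Replace the whole string with the colour word wrapped in styled HTML.
html = re.sub(r'.*((Yellow)|(Red)|(Green)).*',
              r'<div><span style="color: \1">\1</span></div>',
              rag_history)
```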
-
You could utilize an HTML table and utilize HTML code to colorize each value in your string. You'd need to CONCAT the HTML code together with your values and conditionally set the color. I've done a previous writeup on this here:…
-
Since you can't do conditional joins within Magic ETL 2.0 what you can do is add a constant to both datasets in your ETL and call it Join Column with a value of 1. Then do a join on both datasets on the join column. Then feed that into a filter tile to filter where company_holiday_date is BETWEEN your date_start and…
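The constant-then-filter workaround can be sketched in pandas (sample data and the date_end column name are hypothetical; joining on a constant 1 produces every pairing, i.e. a cross join, and the filter then plays the role of the join condition):

```python
import pandas as pd

# Hypothetical sample data for the two inputs.
requests = pd.DataFrame({
    "name":       ["Ann", "Bob"],
    "date_start": pd.to_datetime(["2024-07-01", "2024-12-20"]),
    "date_end":   pd.to_datetime(["2024-07-10", "2024-12-31"]),
})
holidays = pd.DataFrame({
    "company_holiday_date": pd.to_datetime(["2024-07-04", "2024-11-28"]),
})

# Add Constants tile: "Join Column" = 1 on both sides, then join on it.
requests["Join Column"] = 1
holidays["Join Column"] = 1
crossed = requests.merge(holidays, on="Join Column")

# Filter tile: keep pairs where the holiday falls between start and end.
matched = crossed[(crossed["company_holiday_date"] >= crossed["date_start"]) &
                  (crossed["company_holiday_date"] <= crossed["date_end"])]
```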
-
You're not able to see what API script was called to import the data into Domo.
-
You can update the schema (specifically the dataset name) using the update method and updating the schema. There's an example in pydomo: https://github.com/domoinc/domo-python-sdk/blob/dacdee87d9e798f4c83f270b38c8ea2b1b6c7923/examples/dataset.py#L32
-
For another option: STR_TO_DATE(`y` + 1, '%Y') - 1
'%Y' - defaults to the first of the year
+1 - adds a year
-1 - subtracts a day
`y` here is the year in integer format. In other words, it's converting the given year to the first day of the next year, then subtracting 1 day to get the last day of the given year.
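The same arithmetic in Python, with a hypothetical year value:

```python
from datetime import date, timedelta

y = 2023  # the year as an integer

# First day of the next year, minus one day = last day of the given year.
last_day = date(y + 1, 1, 1) - timedelta(days=1)
```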
-
You could attempt to utilize a MySQL dataflow which would give you a bit more flexibility in this case but will execute slower. Here's an example I found which you could use as a template: https://stackoverflow.com/questions/5041537/mysql-csv-row-to-multiple-rows
-
This appears to be a bug. I'd recommend reaching out to support and submit your issue.
