Comments
-
There isn't a way to programmatically do this, but you can utilize the SPLIT_PART function in a formula tile to pull the specific one you want. You'd need multiple formula tiles to pull the different numbers out and then stack everything back together with an Append Rows tile. You'd need to do this for as many different…
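As a rough sketch (assuming a comma-delimited column called `Values`, which is a made-up name), each formula tile would pull one position out of the delimited string:

```
-- formula tile 1: first value from the delimited string
SPLIT_PART(`Values`, ',', 1)

-- formula tile 2: second value (repeat for each position you need)
SPLIT_PART(`Values`, ',', 2)
```

Each tile's output then feeds the Append Rows tile so the pieces stack back into a single column.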
-
You can move the cards by going to Admin > Cards, filtering for the specific page you want to take cards from, selecting all the ones you want to move, and then choosing to move them from the wrench menu in the upper right.
-
I believe the Alpha Vantage connector is no longer usable as it's fully paid now and not accessible via Domo. You'd need to get stock information from a different source. You could utilize a Google Sheet and then use the GOOGLEFINANCE function to pull the associated stock information into the sheet and then pull that…
-
Currently there isn't an option to update the frequency using PyDomo.
-
You can utilize the RANK window function in a beast mode. Something like: RANK() OVER (PARTITION BY `List` ORDER BY SUM(`All Total`))
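By default that orders ascending, so the smallest total gets rank 1. If you want the largest total ranked first, a minor variation (same field names assumed) is to order descending:

```
RANK() OVER (PARTITION BY `List` ORDER BY SUM(`All Total`) DESC)
```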
-
You'd need to take a daily snapshot of the DomoStats data to calculate the row count each day and then use those values to graph the row count day over day. You can use a recursive dataflow to handle this, or alternatively set the stats dataset to append and use the _BATCH_LAST_RUN_ field as your timestamp date. To get the…
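If you go the append route, a minimal sketch of the formula-tile snapshot date (assuming the `_BATCH_LAST_RUN_` field is exposed on your input dataset) could be:

```
-- snapshot date derived from when the batch was appended
DATE(`_BATCH_LAST_RUN_`)
```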
-
You can use the HOUR function to return a value between 0 and 23 to filter on (in your case 9-17):
CASE WHEN HOUR(`Timestamp`) BETWEEN 9 AND 17 THEN 'Keep' ELSE 'Exclude' END
If you want to convert the times to a 12-hour format you can use the DATE_FORMAT function with a specific format string: DATE_FORMAT(`Timestamp`,…
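Purely as an illustration of a 12-hour format string (this particular string is an assumption, not necessarily the one intended), MySQL-style specifiers would look like:

```
DATE_FORMAT(`Timestamp`, '%h:%i %p')  -- e.g. 02:30 PM
```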
-
You might be able to utilize the Domo ODBC driver. https://domo-support.domo.com/s/article/360043437693?language=en_US
-
If you're referring to trellis charts, that option is still there (this is from a bar chart).
-
You'll want to put in the name of your instance and not your website. Enter your customer name. Your customer name is the part of the URL that comes before .domo.com. Ex: If your Domo instance is located at https://some-customer.domo.com, then your customer name is some-customer.
-
No, I think it's been out since Dashboards have been out.
-
If you Edit Dashboard, you can open the Edit Content menu, select Change filter exceptions, and uncheck the box that says Allow global date. This will prevent the page's global date filter from filtering your card.
-
Alternatively you can type ``` and a code block will pop up for you.
-
I'd recommend using an ETL for things like this since you already have your original values. You just need to filter them to the rows where your value isn't null and then join them back together based on the first column in your example.
-
That's how date data types are formatted within Domo. If you want to change the format you can use a DATE_FORMAT function in a beast mode to convert it to a string by passing in the appropriate formatting string. See https://www.w3schools.com/mysql/func_mysql_date_format.asp As for calculating the differences…
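A minimal example (the field name here is made up), converting a date to MM/DD/YYYY text:

```
DATE_FORMAT(`Order Date`, '%m/%d/%Y')  -- e.g. 01/31/2024
```

Keep in mind the result is a string, so it will sort alphabetically rather than chronologically.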
-
It's because your data is on two separate rows. You'd need to either do it in a beast mode on the card or utilize an ETL to combine the data together so you have both records on the same row.
-
You can use the LEAST function to select the smallest of the dates: LEAST(`Observation Date`, `Admit Date`). If the observation date is null then you'll want to default it to the admit date via a COALESCE or IFNULL: LEAST(COALESCE(`Observation Date`, `Admit Date`), `Admit Date`)
-
Looks like you're not passing in the domain with the URL correctly. Without seeing your code I can't tell you exactly but look at where you're specifying which instance / domain you're using to connect to.
-
Yes as it's not in the exclusion list
-
Regular expressions are your friend in this case. You could strip any non-alphanumeric characters from your string with something like this in a formula tile: REGEXP_REPLACE(`field`, '[^a-zA-Z0-9]', '')
[] = a group of possible characters
^ = NOT
a-z, A-Z = any letter
0-9 = any digit
-
Currently virtual datasets aren't included in the governance datasets and there isn't a way to easily get this information.
-
Ideally you'd use a window function to do this but selecting the MAX of a string isn't available in the Rank & Window function in an ETL. Instead use a group by tile and group based off your loan ID and select the MAX of your T/F field (since T is greater than F alphabetically it'll return TRUE if any one of them is true).…
-
You can get a list of the available column types from the pydomo code: https://github.com/domoinc/domo-python-sdk/blob/dacdee87d9e798f4c83f270b38c8ea2b1b6c7923/pydomo/datasets/DataSetModel.py#L16
-
You can pull a custom defined report in the connector setup with Google Analytics and select the fields you wish to pull in. You can select Source and Medium as two separate fields to include.
-
These are untested but something like this should get you what you're looking for:
Previous Quarter Start Date: LAST_DAY(`Report Date`) - INTERVAL (MOD(MONTH(`Report Date`), 3) + 3) MONTH + INTERVAL 1 DAY
Previous Quarter End Date: LAST_DAY(`Report Date`) - INTERVAL (MOD(MONTH(`Report Date`), 3)) MONTH
Previous Two Quarter…
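One thing to double-check with the above: MOD(MONTH(`Report Date`), 3) is 0 for the last month of each quarter (March, June, September, December), and subtracting months from LAST_DAY() can clamp the day (e.g. April 30 minus 1 month is March 30, not March 31). A variant that applies LAST_DAY after shifting the month avoids both issues; this is only a sketch using the same `Report Date` field:

```
-- Previous quarter end: step back past the months already elapsed in the current quarter
LAST_DAY(`Report Date` - INTERVAL (MOD(MONTH(`Report Date`) - 1, 3) + 1) MONTH)

-- Previous quarter start: step back one further quarter, then add a day
LAST_DAY(`Report Date` - INTERVAL (MOD(MONTH(`Report Date`) - 1, 3) + 4) MONTH) + INTERVAL 1 DAY
```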
-
Are you wanting just a simple number or are you wanting to graph this over time? If you're wanting a simple number:
Topics needed per month: (MAX(`Goal`) - SUM(`Topics`)) / (12 - MONTH(CURDATE()) + 1) -- +1 to include the current month
Average amount of topics prior 10 months: CASE WHEN MONTH(CURDATE()) <> 1 THEN --…
-
Here's a more detailed writeup of how to configure a date dimension dataset which will go over how to set up everything and allow you to do simple YoY or PoP % calculations. https://dojo.domo.com/main/discussion/53481/a-more-flexible-way-to-do-period-over-period-comparisons#latest
-
The MySQL dataflow runs slower because it must transfer the 16MM rows to the MySQL server, process them, and then transfer the data back. Querying via the API doesn't need to transfer the data across systems, so it'll be much faster.
-
You can use a recursive dataflow with a formula tile that calculates a timestamp for when the ETL runs, using CURDATE() as the formula. Ideally, if you can calculate a run timestamp within your input dataset, it would be more accurate as to when the data was actually pulled.
-
You’d need to calculate it in a beast mode and then use that value in the text
