Comments
-
Dimension is how you want your totals to be bucketed. If you want the total across your entire dataset, you can remove the `BY Dimension` portion of the FIXED function.
-
Under the chart types section you can select Filter charts, and there you'll find the Sliver chart type.
-
It should be under Subtotal Rows in the chart options to the left of the chart, where you can enable the subtotals.
-
CONVERT_TZ(dt, from_tz, to_tz) converts the datetime value `dt` to a new moment in time such that the original value's wall-clock time when rendered in `from_tz` matches the new value's wall-clock time when rendered in `to_tz`. Alternatively, you could set the timezone of your data in the dataset…
-
You can use SPLIT_PART (Beast Mode or Magic ETL formula tile) or REGEXP_REPLACE (Magic ETL formula tile only): SPLIT_PART(`url`, '/', 3) or REGEXP_REPLACE(`url`, '^(http://)?([^/]+/){2}([^/]+).*$', '$3')
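If it helps to see what the SPLIT_PART variant does, here's a small Python sketch of the same logic (the URL is a hypothetical example; Beast Mode's SPLIT_PART uses 1-based part indexing):

```python
def split_part(text: str, delimiter: str, index: int) -> str:
    """Mimic Beast Mode's SPLIT_PART: return the 1-based `index`-th
    piece of `text` after splitting on `delimiter` ('' if out of range)."""
    parts = text.split(delimiter)
    return parts[index - 1] if 0 < index <= len(parts) else ""

url = "https://www.example.com/products/widgets"  # hypothetical URL
print(split_part(url, "/", 3))  # part 3 is the host: www.example.com
```

Splitting `https://www.example.com/products/widgets` on `/` gives `['https:', '', 'www.example.com', 'products', 'widgets']`, so part 3 is the host portion.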
-
When Domo imports data, it's assumed to be in UTC; then, when the data is visualized, it's translated to the timezone configured in your instance's company settings. You can leverage an ETL with a formula tile and a CONVERT_TZ function to change the timezone of a timestamp if necessary.
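As a rough Python sketch of what that CONVERT_TZ step does (timestamp and target zone are hypothetical): take the naive value as stored, interpret it in the source zone, and re-render the same instant in the target zone.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

# Domo stores this as a naive timestamp assumed to be UTC.
raw = datetime(2024, 3, 1, 15, 30)

# Interpret it in the source zone, then render the same instant
# in the target zone (CONVERT_TZ(raw, 'UTC', 'America/New_York')).
utc = raw.replace(tzinfo=ZoneInfo("UTC"))
local = utc.astimezone(ZoneInfo("America/New_York"))
print(local.strftime("%Y-%m-%d %H:%M"))  # 2024-03-01 10:30
```

On March 1 Eastern time is UTC-5, so 15:30 UTC renders as 10:30; zoneinfo handles the DST boundary for you.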
-
A list of dictionaries is how the Domo Data API returns the data, so you should be able to get that format from the API.
-
You can leverage Domo's built in APIs to get data out of the system. You can reference . Is there a specific format you need besides an array of rows that Domo's Data API provides?
-
I ended up writing my own but if you search for Mapbox Brick in the App Store it'll have 5 different options for you to start from. Here's one: https://[INSTANCE].domo.com/appstore/details/06CF6B893D7412955F4FC0E8F6A0D9107944E493460B256982479760677000B1?origin=search
-
There are several mapping JavaScript libraries you can leverage inside of a Domo Brick. If you search the App Store for Domo Brick and Mapbox, there are ones you can modify. I've leveraged Mapbox, as it allowed me to upload my GeoJSON dataset to Mapbox for storage and then reference and display it within the brick.
-
I'd recommend restructuring your data with a custom date offset so you can easily calculate the prior week. I've done a write up on it here:
-
You could take your entire dataset, set the series column value to 'Grand Total', and then stack that with your original dataset so you have a 'Grand Total' value in your series, giving you the format you're looking for.
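A minimal sketch of that stacking step in Python/pandas, assuming a hypothetical dataset where `region` is the series column:

```python
import pandas as pd

# Hypothetical data: one row per region (the series column) per month.
df = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb"],
    "region": ["East", "West", "East", "West"],
    "sales": [100, 150, 120, 130],
})

# Copy the dataset, overwrite the series column, and stack it back on;
# the card's aggregation then sums the copy into a 'Grand Total' series.
total = df.copy()
total["region"] = "Grand Total"
stacked = pd.concat([df, total], ignore_index=True)
print(stacked["region"].unique())  # ['East' 'West' 'Grand Total']
```

In Magic ETL this is the same shape: a Constant/Replace step to set the series column, then an Append Rows tile back onto the original.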
-
This sounds like a bug. I'd recommend logging a ticket with Domo Support for their team to investigate.
-
It's because you have an aggregate inside an aggregate, which isn't allowed. You could attempt to use a window function so the max is returned for each row depending on your partitions, and then do the count on that value: COUNT(CASE WHEN `campaign_id` > MAX(`latest_unsent_campaign_acct`) FIXED (BY…
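To illustrate why the window function sidesteps the nested-aggregate error, here's a pandas sketch (columns and values are hypothetical): the per-partition max is broadcast onto every row first, so the outer count operates on plain row-level values.

```python
import pandas as pd

# Hypothetical data: campaigns per account, each row carrying an id
# for the latest unsent campaign on that account.
df = pd.DataFrame({
    "account": ["A", "A", "A", "B", "B"],
    "campaign_id": [4, 5, 9, 2, 3],
    "latest_unsent_campaign_acct": [5, 5, 5, 1, 1],
})

# The FIXED (BY account) equivalent: put the per-account max on every
# row without collapsing the dataset...
df["max_unsent"] = (
    df.groupby("account")["latest_unsent_campaign_acct"].transform("max")
)
# ...so the row-level comparison plus an outer count is legal again.
count_newer = int((df["campaign_id"] > df["max_unsent"]).sum())
print(count_newer)  # A: 9>5; B: 2>1, 3>1 → 3
```

The key move is that `transform("max")` returns one value per row rather than one per group, which is exactly what the FIXED window function gives you in Beast Mode.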
-
Can you calculate the date yourself based on other data in each row? If so, you can use an ETL and a formula tile to calculate it. If not, you'll need to reimport your data.
-
Custom apps don't support dynamic dataset changing. Have you thought about combining your datasets together via a dataset view or dataflow into a single dataset and use that output dataset to power your App?
-
Because you have a $ in the data, it's treated as a string, and no mathematical operations can be done on it. Remove the $ from your data, either in the source itself or using an ETL, and convert it to a decimal number. You can then format this number as a currency in the card itself. This will allow you to do…
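The cleanup an ETL formula tile would do can be sketched in Python like this (the sample values are hypothetical):

```python
from decimal import Decimal

def to_decimal(raw: str) -> Decimal:
    """Strip currency formatting so the value is numeric, not a string,
    which is what makes math possible again."""
    return Decimal(raw.replace("$", "").replace(",", "").strip())

print(to_decimal("$1,234.50") + to_decimal("$0.50"))  # 1235.00
```

Once the column is a real decimal, the currency symbol comes back purely as display formatting on the card, not as part of the stored data.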
-
Is your data formatted as just numbers and decimals, or do you have the $ in your raw data?
-
I've outlined an alternative method here: You'd need to use 0 instead of NULL when calculating your column field.
-
You can join your original dataset to the new dataset based on your ID. Then use a formula tile to see where the status fields differ, and populate the updated date using the CURRENT_TIMESTAMP() function.
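The join-and-compare step can be sketched in pandas (hypothetical snapshots; in Magic ETL this would be a Join Data tile followed by a formula tile):

```python
import pandas as pd

# Hypothetical snapshots: the stored dataset vs. today's import.
old = pd.DataFrame({"id": [1, 2, 3], "status": ["open", "open", "closed"]})
new = pd.DataFrame({"id": [1, 2, 3], "status": ["open", "closed", "closed"]})

# Join on id, flag rows whose status changed, and stamp only those
# rows with the current time (CURRENT_TIMESTAMP() in a formula tile).
merged = new.merge(old, on="id", suffixes=("", "_old"))
changed = merged["status"] != merged["status_old"]
merged.loc[changed, "updated_at"] = pd.Timestamp.now()
```

Rows whose status didn't change are left with a null `updated_at`, so you could coalesce in the previously stored date for those in the same flow.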
-
You could implement your own visualization utilizing a Domo Brick and some JavaScript packages. Alternatively, have you thought about utilizing drill paths to narrow down your tables?
-
Hi @SarahR365 I'd recommend starting with as it outlines the Sandbox process and how you can promote connectors and share with users from your current dev instance to your new prod instance.
-
What you're referring to is the special lastvalue parameter in a Workbench query. Some basic information can be found here as an example; you can ignore the fact that this is OLAP Cube documentation and just search for lastvalue to see the documentation on it. This way you can track the latest id you've pulled in and have…
-
What is your setting for the regex body? Are you putting anything in the body when sending the email? Does it work if you put NONE for the regex body setting?
-
You can attempt your DATE_FORMAT call but wrap it inside an IFERROR call to filter out the rows where you're not able to convert to a date with your format. IFERROR(expr1, expr2): if the evaluation of expr1 does not produce an error, IFERROR() returns expr1; otherwise it returns expr2.
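The IFERROR pattern is essentially try-the-conversion-else-fallback, which looks like this in a Python sketch (the format string and sample rows are hypothetical):

```python
from datetime import datetime

def iferror_parse(value: str, fmt: str = "%m/%d/%Y"):
    """IFERROR-style wrapper: return the parsed date, or None for
    rows that don't match the expected format."""
    try:
        return datetime.strptime(value, fmt).date()
    except ValueError:
        return None

rows = ["03/15/2024", "not a date", "12/01/2023"]
parsed = [iferror_parse(r) for r in rows]  # bad rows become None
```

With the bad rows mapped to a sentinel (expr2 in IFERROR), you can then filter them out downstream instead of having the whole conversion fail.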
-
@Sean_Tully - I've used that method as well to convert a string to an integer. I've found it works well.
-
The DomoStats connector offers a Workbench report you can pull which has a lot of the job metadata associated with it.
-
You can use the LIKE statement to see if your string contains certain characters: CASE WHEN `ItemName` LIKE '%A%' OR `ItemName` LIKE '%B%' OR `ItemName` LIKE '%C%' OR `ItemName` LIKE '%P%' THEN 'Ignore' ELSE 'Keep' END. Alternatively, you can use REGEXP_LIKE in a Magic ETL formula tile: CASE WHEN REGEXP_LIKE(`ItemName`,…
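For the regex variant, a single character class replaces the four OR'd LIKE conditions; here's the same logic sketched in Python (the item names are hypothetical):

```python
import re

# One character class covers all four letters the LIKEs checked for.
pattern = re.compile(r"[ABCP]")

def keep_or_ignore(item_name: str) -> str:
    """Same logic as the chained LIKEs / REGEXP_LIKE: flag any item
    whose name contains A, B, C, or P anywhere."""
    return "Ignore" if pattern.search(item_name) else "Keep"

print(keep_or_ignore("Widget-A"))  # Ignore
print(keep_or_ignore("Gizmo"))     # Keep
```

Note this is case-sensitive as written; add the i flag (or `re.IGNORECASE` here) if your data mixes cases.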
-
You'd need to write the changes to your dataset back to Google Sheets so the original source is updated. You can leverage a Google Sheets Writeback connector within a Dataflow.