Comments
-
It's likely that this package isn't installed in the default Domo Jupyter instance. You'd need to reach out to Domo Support to see if they can configure a custom environment with this package.
-
Alternatively, you can use a Transform on your Workbench job to convert your dates into a specific format/timezone.
-
When I'm doing period over period (WoW in your case) analysis, I like to restructure my data so that each day has a relative entry in the dataset. This will allow you to go back in time and see what the capacity and activity levels were no matter the date. I've done a write up in the past on how to do this here:
-
If you're pre-aggregating your data in the Magic ETL and you just want to remove the duplicate records after aggregation, you can use the Remove Duplicates tile, keying off the date, ID, and average columns.
-
You can filter based on the max closed date: CASE WHEN MAX(`closed_date`) FIXED (BY client_id) < CURRENT_DATE() - INTERVAL 2 YEAR THEN 'Stale' ELSE 'Active' END. This will find the max date for each client and allow you to filter for stale values.
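A rough pandas sketch of the same logic, in case it helps to reason about (the column names and the fixed "today" date are assumptions for illustration; Beast Mode would use CURRENT_DATE()):

```python
import pandas as pd

df = pd.DataFrame({
    'client_id': [1, 1, 2],
    'closed_date': pd.to_datetime(['2021-01-10', '2021-06-01', '2024-05-01']),
})

# stand-in for CURRENT_DATE() so the example is reproducible
today = pd.Timestamp('2024-06-01')
cutoff = today - pd.DateOffset(years=2)

# like MAX(`closed_date`) FIXED (BY client_id): every row carries its client's max
max_closed = df.groupby('client_id')['closed_date'].transform('max')
df['status'] = (max_closed < cutoff).map({True: 'Stale', False: 'Active'})
# client 1's latest close (2021-06-01) is over two years old, so both rows are 'Stale'
```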
-
Dimension is how you want your totals to be bucketed. If you want it across your entire dataset, you can remove the BY dimension portion of the FIXED function.
-
Under the Chart Types section you can select Filter Charts, and there you'll find the sliver chart type.
-
It should be under Subtotal Rows in the chart options to the left of the chart, where you can enable the subtotals.
-
CONVERT_TZ() converts the datetime value dt to a new moment in time such that the original value's wall-clock time when rendered in from_tz matches the new value's wall-clock time when rendered in to_tz: CONVERT_TZ(dt, from_tz, to_tz). Alternatively you could set the timezone of your data in the dataset…
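If it helps to see those semantics outside of Domo, here's a small Python sketch using the standard zoneinfo module (the convert_tz function name and sample values are just for illustration):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def convert_tz(dt, from_tz, to_tz):
    # interpret dt's wall-clock time in from_tz, then render the same
    # instant as a wall-clock time in to_tz (mirrors CONVERT_TZ's behavior)
    return (dt.replace(tzinfo=ZoneInfo(from_tz))
              .astimezone(ZoneInfo(to_tz))
              .replace(tzinfo=None))

convert_tz(datetime(2024, 1, 15, 12, 0), 'UTC', 'America/New_York')
# -> 2024-01-15 07:00, i.e. UTC noon rendered as Eastern wall-clock time
```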
-
You can use SPLIT_PART (Beast Mode or Magic ETL Formula tile) or REGEXP_REPLACE (Magic ETL Formula tile only): SPLIT_PART(`url`, '/', 3) or REGEXP_REPLACE(`url`, '^(http://)?([^/]+/){2}([^/]+).*$', '$3')
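To sanity-check those two expressions, here's a Python mock-up (the split_part helper and sample URL are illustrative, and the out-of-range behavior is an assumption; note the two expressions count segments differently, so test them against your actual URLs):

```python
import re

def split_part(s, delim, n):
    # 1-indexed, mimicking SPLIT_PART
    parts = s.split(delim)
    return parts[n - 1] if 0 < n <= len(parts) else ''

url = 'http://example.com/products/widget'

split_part(url, '/', 3)
# -> 'example.com' (splitting on '/' makes the domain the 3rd piece)

re.sub(r'^(http://)?([^/]+/){2}([^/]+).*$', r'\3', url)
# -> 'widget' (the capture group grabs the segment after two '/'-delimited chunks)
```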
-
When Domo imports data, it's assumed to be in UTC; then when the data is visualized, it's translated to the timezone configured in your instance's company settings. You can leverage an ETL with a Formula tile and a CONVERT_TZ function to change the timezone of a timestamp if necessary.
-
A list of dictionaries is how the Domo Data API returns the data so you should be able to get that format from the API.
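For reference, a list of dictionaries also round-trips cleanly to and from a pandas DataFrame, if that's your end target (the sample rows here are made up):

```python
import pandas as pd

rows = [{'id': 1, 'region': 'East'}, {'id': 2, 'region': 'West'}]  # API-shaped rows
df = pd.DataFrame(rows)              # list-of-dicts -> DataFrame
back = df.to_dict(orient='records')  # DataFrame -> list-of-dicts
```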
-
You can leverage Domo's built in APIs to get data out of the system. You can reference . Is there a specific format you need besides an array of rows that Domo's Data API provides?
-
I ended up writing my own but if you search for Mapbox Brick in the App Store it'll have 5 different options for you to start from. Here's one: https://[INSTANCE].domo.com/appstore/details/06CF6B893D7412955F4FC0E8F6A0D9107944E493460B256982479760677000B1?origin=search
-
There are several mapping JavaScript libraries you can leverage inside of a Domo Brick. If you search the App Store for Domo Brick and Mapbox, there are ones you can modify. I've leveraged Mapbox as it allowed me to upload my GeoJSON dataset to Mapbox for storage and then reference and display it within the brick.
-
I'd recommend restructuring your data with a custom date offset so you can easily calculate the prior week. I've done a write up on it here:
-
You could take your entire dataset, set the series column value to 'Grand Total', and then stack that with your original dataset so you have a 'Grand Total' value in your series, giving you the format you're looking for.
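In Magic ETL this would be a Formula tile (or Add Constants) plus an Append tile; the same stacking sketched in pandas, with invented sample data:

```python
import pandas as pd

df = pd.DataFrame({'series': ['A', 'B'], 'week': [1, 1], 'value': [10, 20]})

totals = df.copy()
totals['series'] = 'Grand Total'          # relabel every row in the copy
stacked = pd.concat([df, totals], ignore_index=True)
# the card then aggregates all 'Grand Total' rows into one extra series
```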
-
This sounds like a bug. I'd recommend logging a ticket with Domo Support for their team to investigate.
-
It's because you have an aggregate inside an aggregate, which isn't allowed. You could attempt to use a window function so the max is returned for each row depending on your partitions, and then do the count on that value: COUNT(CASE WHEN `campaign_id` > MAX(`latest_unsent_campaign_acct`) FIXED (BY…
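The window-then-count idea, sketched in pandas (the partition column and data are invented for illustration):

```python
import pandas as pd

df = pd.DataFrame({
    'account': ['a', 'a', 'b', 'b'],
    'campaign_id': [5, 9, 3, 8],
    'latest_unsent_campaign_acct': [4, 4, 6, 6],
})

# window max: every row gets its partition's max instead of collapsing rows
df['max_unsent'] = df.groupby('account')['latest_unsent_campaign_acct'].transform('max')
# the comparison is now row-level, so a plain count works on top of it
count = int((df['campaign_id'] > df['max_unsent']).sum())
```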
-
Can you calculate the date yourself based on other data in each row? If so, you can use an ETL with a Formula tile to calculate it. If not, you'll need to reimport your data.
-
Custom apps don't support dynamic dataset changing. Have you thought about combining your datasets together via a dataset view or dataflow into a single dataset and use that output dataset to power your App?
-
Because you have a $ in the data, it's treated as a string, and no mathematical operations can be done on it. Remove the $ from your data, either in the source itself or using an ETL, and convert it to a decimal number. You can then format this number as a currency in the card itself. This will allow you to do…
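A quick illustration of that cleanup step (assuming US-style currency formatting; the helper name is just for the example):

```python
def to_decimal(price):
    # '$1,234.50' -> 1234.5: strip the currency symbol and thousands separators,
    # then parse the remainder as a decimal number
    return float(price.replace('$', '').replace(',', ''))

to_decimal('$1,234.50')  # -> 1234.5
```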
-
Is your data formatted as just numbers and decimals, or do you have the $ in your raw data?
-
I've outlined an alternative method here: You'd need to use 0 instead of NULL when calculating your column field.
-
You can join your original dataset to the new dataset based on your ID. Then use a formula tile to see where the status fields differ and then populate the updated date using CURRENT_TIMESTAMP() function.
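The join-and-compare logic looks roughly like this in pandas (column names and sample data are placeholders; in Domo the timestamp would come from CURRENT_TIMESTAMP()):

```python
import pandas as pd

old = pd.DataFrame({'id': [1, 2, 3], 'status': ['open', 'open', 'closed']})
new = pd.DataFrame({'id': [1, 2, 3], 'status': ['open', 'closed', 'closed']})

# join on the ID, keeping the prior status alongside the new one
merged = new.merge(old, on='id', suffixes=('', '_prev'))
changed = merged['status'] != merged['status_prev']
# only rows whose status changed get a fresh updated timestamp
merged.loc[changed, 'updated_at'] = pd.Timestamp.now()
```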
-
You could implement your own visualization using a Domo Brick and some JavaScript packages. Alternatively, have you thought about utilizing drill paths to narrow down your tables?
-
Hi @SarahR365 I'd recommend starting with as it outlines the Sandbox process and how you can promote connectors and share with users from your current dev instance to your new prod instance.
-
What you're referring to is the special lastvalue parameter in a Workbench query. Some basic information can be found here as an example; you can ignore the fact that this is OLAP Cube documentation and just search for lastvalue to see the documentation on it. This way you can track the latest ID you've pulled in and have…
-
What is your setting for the regex body? Are you putting anything in the body when sending the email? Does it work if you put in NONE for the regex body setting?