Comments
-
Have you tried re-authenticating the account to approve the scopes necessary for the connection?
-
Okta's System Log API retains data for 90 days (source), so it should be possible to get data for the last 30 days. Since Domo provides the API connector, they'd need to be the ones to change the code to pull 30 days instead of 7. As Colemen mentioned, you can log an idea in the Idea Exchange or you can possibly log a…
-
Ah, since you're displaying dates on your x-axis and you don't want the dates included in your amount, you can use the REMOVE clause in your FIXED function:
-
You could use a beast mode to display the selected state if only one is selected: CASE WHEN COUNT(DISTINCT `State`) = 1 THEN MAX(`State`) ELSE '' END
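The logic of that Beast Mode can be sketched in plain Python (the function name and sample state values are just for illustration): show the state only when exactly one distinct value survives the filters, otherwise show nothing.

```python
def selected_state_label(states):
    """CASE WHEN COUNT(DISTINCT State) = 1 THEN MAX(State) ELSE '' END,
    applied to the State values left after the card's filters."""
    distinct = set(states)
    # With a single distinct state, MAX(State) is just that state.
    return max(distinct) if len(distinct) == 1 else ""

print(selected_state_label(["Utah", "Utah"]))   # one state selected -> 'Utah'
print(selected_state_label(["Utah", "Idaho"]))  # multiple states -> ''
```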
-
Have you tried using your FIXED function, but instead of using BY date, use FILTER DENY date and have no BY clause?
-
When I'm doing period over period type analysis I'll restructure the data so that I can reference each period type for each day, then use Beast Modes to calculate the differences and the % changes. I've done a write-up on this in the past:
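The restructuring idea can be sketched like this (the 7-day offset and the "Current"/"Prior" labels are illustrative assumptions for a week-over-week setup): each source row is duplicated, with the copy's date shifted forward one period, so every reporting date carries both its own value and the prior period's value.

```python
from datetime import date, timedelta

rows = [{"date": date(2024, 6, 10), "amount": 100}]

# Emit a 'Current' copy of each row, plus a copy shifted forward 7 days
# labeled 'Prior'. Filtering a card to one date then surfaces both the
# current amount and the amount from a week before.
restructured = []
for r in rows:
    restructured.append({**r, "period": "Current"})
    restructured.append({**r, "date": r["date"] + timedelta(days=7),
                         "period": "Prior"})
print(restructured)
```

In Magic ETL this is typically an Append of the original rows with a copy whose date column has been offset.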
-
Are your transactions duplicated? If not you could do a conditional sum like: SUM(CASE WHEN `transaction_type` = 'Quantity' THEN `quantity` ELSE 1 END)
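A quick sketch of what that conditional SUM does (the row shapes here are made up for illustration): add the quantity for 'Quantity' rows and count every other row as 1.

```python
def conditional_sum(rows):
    """SUM(CASE WHEN transaction_type = 'Quantity' THEN quantity ELSE 1 END):
    sum the quantity for 'Quantity' rows; every other row contributes 1."""
    return sum(r["quantity"] if r["transaction_type"] == "Quantity" else 1
               for r in rows)

rows = [
    {"transaction_type": "Quantity", "quantity": 5},
    {"transaction_type": "Sale", "quantity": 2},
]
print(conditional_sum(rows))  # 5 + 1 = 6
```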
-
Can you connect to the SFTP server using another tool besides Domo? If you can I'd recommend reaching out to Domo Support
-
Currently this isn't possible; however, you may be able to build this sort of functionality within a Domo Brick.
-
Likely the case is that this package isn't installed in the default Domo Jupyter instance. You'd need to reach out to Domo Support to see if they can configure a custom environment with this package.
-
Alternatively, you can also use a Transform on your workbench job to transition your dates into a specific format/timezone.
-
When I'm doing period over period (WoW in your case) analysis I like to restructure my data so that each day has a relative entry in the dataset. This will allow you to go back in time and see what the capacity and activity levels were no matter the date. I've done a write-up in the past on how to do this here:
-
If you're pre-aggregating your data in Magic ETL and you just want to remove the duplicate records after aggregation, you can use the Remove Duplicates tile keying off of the date, ID, and average.
-
You can filter based on the max closed date: CASE WHEN MAX(`closed_date`) FIXED (BY `client_id`) < CURRENT_DATE() - INTERVAL 2 YEAR THEN 'Stale' ELSE 'Active' END. This will find the max date for each client and allow you to filter for stale values.
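Here's the same logic sketched in Python (the sample client IDs and the fixed "today" are illustrative, and the 730-day cutoff only approximates INTERVAL 2 YEAR): find each client's latest closed date, then label clients with no activity in roughly two years as Stale.

```python
from datetime import date, timedelta

def classify_clients(rows, today=date(2024, 6, 1)):
    """Mimic MAX(`closed_date`) FIXED (BY `client_id`): compute each
    client's latest closed date, then compare it to a 2-year cutoff."""
    latest = {}
    for r in rows:
        cid = r["client_id"]
        latest[cid] = max(latest.get(cid, r["closed_date"]), r["closed_date"])
    cutoff = today - timedelta(days=730)  # approximate INTERVAL 2 YEAR
    return {cid: ("Stale" if d < cutoff else "Active")
            for cid, d in latest.items()}

rows = [
    {"client_id": "a", "closed_date": date(2021, 1, 1)},
    {"client_id": "b", "closed_date": date(2024, 3, 1)},
]
print(classify_clients(rows))  # {'a': 'Stale', 'b': 'Active'}
```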
-
Dimension is how you want your totals to be bucketed. If you want it across your entire dataset, you can remove the BY Dimension portion of the FIXED function.
-
Under the Chart Types section you can select Filter Charts, and there you'll find the sliver chart type.
-
It should be under Subtotal Rows in the chart options to the left of the chart, where you can enable the subtotals.
-
CONVERT_TZ() converts the datetime value dt to a new moment in time such that the original value's wall-clock time when rendered in from_tz matches the new value's wall-clock time when rendered in to_tz. Syntax: CONVERT_TZ(dt, from_tz, to_tz). Alternatively you could set the timezone of your data in the dataset…
-
You can use SPLIT_PART (Beast Mode or Magic ETL Formula tile) or REGEXP_REPLACE (Magic ETL Formula tile only): SPLIT_PART(`url`, '/', 3) or REGEXP_REPLACE(`url`, '^(http://)?([^/]+/){2}([^/]+).*$', '$3')
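To see what SPLIT_PART(`url`, '/', 3) extracts, here's the equivalent in Python (the sample URL is made up): splitting an http:// URL on '/' makes the host the third 1-indexed piece, because the scheme leaves an empty piece between the two slashes.

```python
def third_part(url: str) -> str:
    """Mimic SPLIT_PART(`url`, '/', 3): split on '/' and take the third
    piece (1-indexed). For 'http://host/path' the pieces are
    ['http:', '', 'host', 'path'], so piece 3 is the host."""
    parts = url.split("/")
    return parts[2] if len(parts) >= 3 else ""

print(third_part("http://www.example.com/page/one"))  # www.example.com
```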
-
When Domo imports data, it's assumed to be within UTC, then when the data is visualized, it's translated to the timezone configured in your instance company settings. You can leverage an ETL with a formula tile and a CONVERT_TZ function to change the timezone of a timestamp if necessary.
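The CONVERT_TZ behavior can be sketched with Python's zoneinfo (the function name and the America/Denver example are illustrative): treat the value's wall-clock time as being in from_tz, then express the same instant as a wall-clock time in to_tz.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def convert_tz(dt: datetime, from_tz: str, to_tz: str) -> datetime:
    """Mimic CONVERT_TZ(dt, from_tz, to_tz): interpret dt's wall-clock
    time in from_tz, return the equivalent wall-clock time in to_tz."""
    localized = dt.replace(tzinfo=ZoneInfo(from_tz))
    return localized.astimezone(ZoneInfo(to_tz)).replace(tzinfo=None)

# 12:00 UTC on a January date is 05:00 in Denver (UTC-7, standard time).
print(convert_tz(datetime(2024, 1, 15, 12, 0), "UTC", "America/Denver"))
```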
-
A list of dictionaries is how the Domo Data API returns the data so you should be able to get that format from the API.
-
You can leverage Domo's built in APIs to get data out of the system. You can reference . Is there a specific format you need besides an array of rows that Domo's Data API provides?
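If you do need a different shape, reshaping the array of rows client-side is straightforward. This sketch assumes a hypothetical response with separate column names and row arrays, and zips them into a list of dictionaries:

```python
import json

# Hypothetical response shape: column names plus an array of row arrays.
raw = {
    "columns": ["id", "state", "amount"],
    "rows": [[1, "Utah", 10.5], [2, "Idaho", 3.0]],
}

# Pair each row's values with the column names to get one dict per record.
records = [dict(zip(raw["columns"], row)) for row in raw["rows"]]
print(json.dumps(records))
```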
-
I ended up writing my own but if you search for Mapbox Brick in the App Store it'll have 5 different options for you to start from. Here's one: https://[INSTANCE].domo.com/appstore/details/06CF6B893D7412955F4FC0E8F6A0D9107944E493460B256982479760677000B1?origin=search
-
There are several mapping JavaScript libraries you can leverage inside of a Domo Brick. If you search the App Store for Domo Brick and Mapbox there are ones you can modify. I've leveraged Mapbox, as it allowed me to upload my GeoJSON dataset to Mapbox to store, then reference and display it within the Brick.
-
I'd recommend restructuring your data with a custom date offset so you can easily calculate the prior week. I've done a write-up on it here:
-
You could take your entire dataset, set the series column value to 'Grand Total' and then stack that with your original dataset so you then have a Grand Total value in your series to get the format you're looking for.
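A small sketch of the stacking idea (the series and amount values are made up): copy every row with its series overwritten to 'Grand Total', then append the copy under the original rows, which is what an Append tile in Magic ETL would do.

```python
rows = [
    {"series": "East", "month": "Jan", "amount": 10},
    {"series": "West", "month": "Jan", "amount": 5},
]

# Duplicate every row with the series renamed to 'Grand Total', then
# stack the copy under the originals. The chart can then aggregate the
# 'Grand Total' series alongside East and West.
grand_total = [{**r, "series": "Grand Total"} for r in rows]
stacked = rows + grand_total
print(stacked)
```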
-
This sounds like a bug. I'd recommend logging a ticket with Domo Support for their team to investigate.
-
It's because you have an aggregate inside an aggregate, which isn't allowed. You could attempt to use a window function so the max is returned for each row depending on your partitions, and then do the count on that value: COUNT(CASE WHEN `campaign_id` > MAX(`latest_unsent_campaign_acct`) FIXED (BY…
-
Can you calculate the date yourself based on other data in each row? If so, you can use an ETL and a Formula tile to calculate it. If not, you'll need to reimport your data.