Comments
-
You might want to look into Segments to show the total but not have your filters apply to the totals. You can read up on Segments here: https://domohelp.domo.com/hc/en-us/articles/4403089503383-Creating-Segments-in-Analyzer
-
There is a Domo Governance Datasets connector which has several datasets including one on beast modes. You can pull that dataset and filter to a specific dataset ID. You can read up on this here: https://domohelp.domo.com/hc/en-us/articles/360056318074-Domo-Governance-Datasets-Connector
-
You're using backticks (`) for your 2021 value instead of a single quote (') to denote that it's a string:
CASE WHEN `end_date__c` >= '1/01/2021' AND `end_date__c` <= '12/31/2021' THEN '2021' ELSE '0' END
-
You'll need to pass in a specific value with the pfilter for it to work properly. You can use the row's value with the specific column (Campaign Name in your case) and concatenate it with your URL. There's an example of this here:…
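The general shape is something like this (the instance URL, page ID, and column below are placeholders you'd swap for your own, and you may need to URL-encode the JSON portion):
CONCAT('https://yourinstance.domo.com/page/123456789?pfilters=[{"column":"Campaign Name","dataType":"string","operand":"IN","values":["', `Campaign Name`, '"]}]')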
-
Hi Hal, you can use pfilters for this and dynamically create your URL in a beast mode based on the fields you want to filter. Having the pages embedded isn't a requirement.
-
Alternatively, you could store the three values in a lookup table dataset and use an ETL to left join it to your original data, then use a formula tile to do a COALESCE to set the default value (your else clause). That way you keep your lookup list in a single place rather than possibly across multiple beast modes.
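For example, after the left join the formula tile could look something like this (the column name and default value here are just placeholders):
COALESCE(`Lookup Value`, 'Default Value')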
-
Instead of using ELSE WHEN just use WHEN for additional cases. Use just ELSE for your default clause.
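As a quick sketch of the structure (the field and values here are placeholders):
CASE
WHEN `Status` = 'Open' THEN 'Active'
WHEN `Status` = 'Pending' THEN 'Waiting'
ELSE 'Other'
END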
-
You can utilize a Rank & Window tile (see https://domohelp.domo.com/hc/en-us/articles/360044876094-Magic-ETL-Tiles-Aggregate#3.) within Magic ETL to select the LAG value of your column. Domo does offer a Data Science suite of tiles which includes the outlier detection tile to calculate standard deviation but this is a…
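In SQL terms, the Rank & Window tile with Lag is doing the equivalent of something like this (the column names are placeholders):
LAG(`Value`) OVER (PARTITION BY `Category` ORDER BY `Date`)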
-
This isn't possible as you can't aggregate a window function within a beast mode. What you'd need to do is perform your lag in an ETL and then calculate the average in your card. You may lose some filtering abilities as it's going to be pre-aggregated.
-
Use an ETL to group your tickets dataset based on the identifying column (ID, name, etc.) and do a count on your ticket ID field, then use a Join tile to join that with your user table on the same identifying column.
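If it helps to see the equivalent logic in SQL terms (the table and column names here are just placeholders):
SELECT u.*, t.ticket_count
FROM users u
LEFT JOIN (
SELECT `user_id`, COUNT(`ticket_id`) AS ticket_count
FROM tickets
GROUP BY `user_id`
) t ON u.`user_id` = t.`user_id`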
-
When you say 47 rows and 14K records - do you mean 47 columns and 14k rows? Have you confirmed that each row has the same number of fields?
-
Use DataSet Views. It’s the same underbelly, but views have more options, and eventually Fusions will be deprecated. Converting them shouldn’t cause any issues as it’ll keep the same ID, just stored in a different format. I haven’t had any issues swapping a Fusion for a view.
-
You're getting a blank because your CASE statement is likely returning a NULL value. Are you displaying your data split by anything other than Company?
-
Earliest date for the year:
MIN(MIN(`Date`)) OVER (PARTITION BY YEAR(`Date`))
Latest date for the year:
MAX(MAX(`Date`)) OVER (PARTITION BY YEAR(`Date`))
Remove the PARTITION BY clause if you want it across your entire dataset.
-
That's how the data is displayed: text and dates are typically left justified and numbers are right justified. There isn't any extra space or padding added automatically (unless there are spaces in your actual data). Do the IDs exist in both datasets? Are they both the same data type (numbers)? Have you manually…
-
Yeah, that sounds reasonable. The one caveat to recursive dataflows is that they don't scale well: the larger the dataset grows, the longer the ETL will take to run (more data to transfer means more time).
-
Correct, looking at the history of both the ETL and the DataSet. Once the ETL finishes running, does the DataSet history show anything like "Indexing..." before it finishes? It may be a case where the ETL finishes processing the data but Domo is still working on storing the data and preparing it for…
-
Does the dataset show that it's finished or is it still stuck on another step when you look at it through the UI?
-
You can define a beast mode to conditionally set the context description and then drag that beast mode to the tooltip section at the top of Analyzer. You may need to click the tooltip icon at the top to display the tooltip option. Then, in your hover text setting on the card, you can include the TOOLTIP1 variable in the text.
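As a rough sketch of the conditional beast mode (the field names and text here are just placeholders):
CASE WHEN `Sales` >= `Goal` THEN 'Met or exceeded goal' ELSE 'Below goal' END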
-
Correct, you'd only pull in the changes that need to be applied, which keeps your data processing quicker (fewer records). You'd need an initial pull of all your data to establish your baseline, but after that you can just pull in the records that have changed since the last time it ran.
-
You can configure a custom report with the GA connector to select the exact dimensions and attributes you're looking for.
-
It’d be easier to pivot your data in an ETL and then use a formula tile or beast mode to calculate the difference between the two column values.
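For example, once each value is in its own column the formula can be as simple as this (placeholder column names):
`Current Value` - `Previous Value`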
-
Currently no, this isn't possible but I'd recommend adding this as an idea to the idea exchange to get this functionality added.
-
It's not a very well documented feature, I just happened to stumble across it when attempting to come up with a solution for you. If you could accept the answer so others will find it I'd appreciate it.
-
You can use the ERROR function in a formula tile to cause the dataflow to error out with a specific message:
CASE WHEN `Number` < 0 THEN ERROR('Number is negative') END
The one caveat to this is that you have to save the ETL immediately after you enter this formula; otherwise, if it attempts to validate your ETL it won't…
-
CASE
WHEN `Date` + INTERVAL (7 - DAYOFWEEK(`Date`)) DAY = CURRENT_DATE() - INTERVAL DAYOFWEEK(CURRENT_DATE()) DAY THEN 'Last Week'
ELSE 'Not Last Week'
END
To add some additional context: I'm adding a specific number of days to the date field to get to Saturday (7) of that day's week, then comparing it to last Saturday…
-
I'd recommend importing your historical file into Domo, then creating an ETL that simply outputs your historical dataset to a dataflow dataset. Then you can remove the historical input and convert your Magic ETL to a recursive dataflow to add any new records or update any existing records in your dataflow dataset…
-
Likely you're swapping the order of the parameters: it expects the end date as the first parameter and the start date as the second. If you reverse them you should get a positive number.
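If this is DATEDIFF (an assumption on my part, since the function isn't named), with placeholder field names the positive form would look something like this:
DATEDIFF(`end_date`, `start_date`)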
-
I’d recommend reformatting your dataset to use a custom date offset dimension dataset. This will allow it to interact with the page date filters and simplify your data structure. I’ve done a write up on this previously here:…
-
Currently this isn’t supported with a Gantt chart. You could possibly use a drill path to go to a table card, where you can display a hyperlink to then open your project.