Comments
-
@user014054 - You need to give your output dataset a name under the configuration tab. Click the "Output Dataset Name" and name it however you'd like.
-
Another option would be to utilize the Replace Text tile and use a regular expression: ^.*(\d{1,2}\/\d{1,2}\/\d{4}).*$ Put the regular expression above in box 2 of the step. Also make sure to click the gear icon and select "Use RegEx". In step 3, place $1 as the replacement text.
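For example (with a hypothetical input value), applying that pattern with $1 as the replacement would turn "Invoice received 3/15/2021 via email" into "3/15/2021".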
-
You could also just treat it like an actual decimal number and utilize the following BeastMode: ABS(HOUR(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)) + (MINUTE(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)) / 60))
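For illustration, assuming a hypothetical start_time of 09:00 and stop_time of 10:45 on the same day: TIMEDIFF returns 01:45:00, so the expression evaluates to ABS(1 + (45 / 60)) = 1.75 hours.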
-
Try this: CONCAT( HOUR(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)), '.', CASE WHEN MINUTE(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)) < 10 THEN CONCAT('0', MINUTE(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`))) ELSE…
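The ELSE branch got cut off above; presumably it just returns the unpadded minute value. A complete (untested) sketch under that assumption:
CONCAT(
  HOUR(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)),
  '.',
  CASE
    WHEN MINUTE(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)) < 10
      THEN CONCAT('0', MINUTE(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`)))
    ELSE MINUTE(TIMEDIFF(`lte_donation_phleb.stop_time`, `lte_donation_phleb.start_time`))
  END
)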
-
@user11590 - You could utilize CONCAT to combine the different data elements. Note: the following code is untested. I'm using the same logic you have for the dates so it only displays the dates that match the revenue $ logic: CONCAT( MIN( CASE WHEN DAY(CURDATE()) >= 1 AND DAY(CURDATE()) <= 18 THEN CASE WHEN…
-
@imelendez - @ST_-Superman-_ has it correct. It's a simple case statement to handle the nulls and replace them with 0. Depending on your aggregation level you may need to wrap it in a SUM() (or do this in the sorting section of the card building interface).
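A minimal sketch of that pattern, using `amount` as a placeholder for your actual column:
SUM(CASE WHEN `amount` IS NULL THEN 0 ELSE `amount` END)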
-
That's functioning as intended. NULLs always appear last. The actual data should appear after the NULLs, descending. What sort of behavior are you expecting? If you want to override the sorting behavior you can utilize a separate beast mode to handle the NULLs differently and sort based on this new field rather than the…
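As a rough sketch, assuming a hypothetical numeric column `value` and that you want NULLs to sort as if they were the lowest value:
CASE WHEN `value` IS NULL THEN -999999 ELSE `value` END
Sort the card on this field instead of the original column.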
-
@user14867 - You're getting a bunch of extra records because you're doing a large many-to-many cross join. What you'll need to do is reduce your dataset before you do your join so you only have one record per join key. It should just be two simple Group By widgets; they will both spawn from your original dataset. The first Group…
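If it helps to picture the idea outside of Magic ETL, here is the same "aggregate first, then join" pattern in SQL terms; the table and column names below are hypothetical, not from your dataset:
SELECT a.`customer_id`, a.metric_one, b.metric_two
FROM (SELECT `customer_id`, SUM(`value_one`) AS metric_one FROM original_dataset GROUP BY `customer_id`) a
JOIN (SELECT `customer_id`, SUM(`value_two`) AS metric_two FROM original_dataset GROUP BY `customer_id`) b
  ON a.`customer_id` = b.`customer_id`
Each subquery plays the role of one Group By widget, so each side of the join carries only one record per key.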
-
I don't think there's a way to do this since the annotations are stored on the date level and not any filtered sublevel.
-
You'll want to reach out to your Domo contact for assistance.
-
@SusieQ - Have you installed the package on the Salesforce side for All Users? When you're connecting Domo to your Salesforce instance are you using the sandbox connection or your production environment connection?
-
Currently Domo doesn't have this exact feature. While @Aditya_Jain's solution would work, it won't allow you to keep other filters for the user to select. There's an idea that's been posted under the Idea Exchange specifically for this request:…
-
Alternatively you could utilize max and a case statement like the following MAX(CASE WHEN `lte_donation_general.hemoglobin2` IS NOT NULL THEN 1 ELSE 0 END) + MAX(CASE WHEN `lte_donation_general.hemoglobin` IS NOT NULL THEN 1 ELSE 0 END)
-
Have you tried taking the count of distinct donation IDs for each field and adding the two together? COUNT(DISTINCT CASE WHEN `lte_donation_general.hemoglobin2` IS NOT NULL THEN `lte_donation_general.donation_id` END) + COUNT(DISTINCT CASE WHEN `lte_donation_general.hemoglobin` IS NOT NULL THEN `lte_donation_general.donation_id` END)
-
@SGuglietta Have you tried a different browser or logging out and back in? It seems to work fine for me.
-
@DP_BA - Can you utilize the ad_id, or are you pulling your Facebook data on a different dimension? What Reports do you have selected, and what report edge / level is being utilized?
-
If you know all of the filter combinations ahead of time you can create a beast mode for each with case statements to return the "filter name", then add a quick filter based on those predefined options:
=Sunshine CASE WHEN lower(`field`) like '%sunshine%' THEN 'Yes' ELSE 'No' END
=Mountain CASE WHEN lower(`field`) like '%mountain%'…
-
Great! To clarify: you needed to filter the data before applying the SUM function. Your original version was essentially taking the sum of all the records if one of them was for Actuals. Moving the SUM to the outside of your case statement causes the data to be filtered first, giving you the desired result.
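Roughly the difference, using the field names from your formula (the first line is presumably the shape you started with):
CASE WHEN `transaction_type` = 'Actuals' THEN SUM(`net`) END (the SUM runs over all the records in the group, not just the Actuals rows)
SUM(CASE WHEN `transaction_type` = 'Actuals' THEN `net` ELSE 0 END) (the condition is evaluated per row before anything is summed)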
-
@mamedu- Try this: SUM(CASE WHEN `transaction_type` = 'Actuals' AND ( ( YEAR(`transaction_date`) = YEAR(CURRENT_DATE()) AND MONTH(`transaction_date`) < 7 ) OR ( YEAR(`transaction_date`) = YEAR(CURRENT_DATE())-1 AND MONTH(`transaction_date`) >= 7 ) ) THEN `net` ELSE 0 END)
-
Have you confirmed the value of 'Actuals' in your data? Does it have any trailing whitespace? Instead of = 'Actuals', have you tried doing `transaction_type` LIKE '%Actuals%'? Here's a rewrite of your logic, though it appears correct glancing over it: CASE WHEN `transaction_type` = 'Actuals' AND ((YEAR(`transaction_date`) =…
-
Interestingly enough, my initial response was deleted as well. Here's what I initially wrote: @CurtisS You can do it with a MySQL transformation. I plugged your example dataset into the gs_test_data_set referenced below. I utilized two Transforms but you might just need one with a separate dataset. One transform was just…
-
@CurtisS You can do it with a MySQL transformation. I plugged your example dataset into the gs_test_data_set referenced below. I utilized two Transforms but you might just need one with a separate dataset. One transform was just generating a list of all of the products (I called this product_list): select 'Product A1' as…
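That first transform presumably continues as a UNION of one row per product; a minimal sketch (the product names after 'Product A1' are made up):
select 'Product A1' as product
union all select 'Product A2'
union all select 'Product B1'
union all select 'Product B2'
The idea, presumably, is that product_list can then be joined back to the rest of the data so every product shows up even when it has no matching records.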
-
@WizardOz - Are you wanting to allow the smaller groups if the total number of users is > 20? One other thing to keep in mind: you may want to remove the drill path to the underlying data so that they can't see the small group of users. You'd also want to be careful with what's being graphed so the small user…
-
Do you have an example of your like statement or how it wasn't consistent? Like should work to determine if the order season is in your selling season CSV string. CASE WHEN `Selling Seasons` like CONCAT('%', `Order Season`, '%') THEN 'Y' ELSE 'N' END
-
If you don't care about keeping a historical record you could do it with Magic ETL:
- Date Operation to get the Day of Week on your main data set
- Split your main dataset using a filter to pull the last 60 days
- Group on Day of Week, calculating the average
- Join this grouping back to your original dataset (that you…
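If it's easier to picture as SQL, the same idea in one pass (table and column names here are hypothetical):
SELECT t.*, avg60.avg_value
FROM main_dataset t
JOIN (
  SELECT DAYOFWEEK(`date`) AS day_of_week, AVG(`value`) AS avg_value
  FROM main_dataset
  WHERE `date` >= CURRENT_DATE() - INTERVAL 60 DAY
  GROUP BY DAYOFWEEK(`date`)
) avg60 ON DAYOFWEEK(t.`date`) = avg60.day_of_week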
-
Here's an example. Note: this assumes you have one record per date; otherwise you'll get duplicate records. If you want it at the per-invoice-record level, you'd need to include another primary key in your joins beyond just the date.
-
What I've done in the past is join the table twice, once based on the open date, and again based on the closed date. Then I join those two split data sets back together based on Open Date = Closed Date and rename the count field to be Open Count and Closed Count (or whatever you'd like to call it). Then you can graph based…
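Roughly, in SQL terms (the tickets table and column names are placeholders for whatever your data uses):
SELECT o.`date`, o.open_count, c.closed_count
FROM (SELECT `open_date` AS `date`, COUNT(*) AS open_count FROM tickets GROUP BY `open_date`) o
JOIN (SELECT `closed_date` AS `date`, COUNT(*) AS closed_count FROM tickets GROUP BY `closed_date`) c
  ON o.`date` = c.`date`
Depending on your data you may want a left or full join instead, so dates that only have opens (or only closes) still show up.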
-
Because you're wanting to do an average of the count (an aggregate of an aggregate) BeastMode won't be able to help you. You'll need to utilize a data flow to track your running total / count and then you could utilize a beast mode to calculate the difference between your current day's value and the average count.
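A rough sketch of what that dataflow could produce (the dataset and column names are hypothetical):
-- step 1: count per day
SELECT `date`, COUNT(*) AS daily_count FROM source_dataset GROUP BY `date`
-- step 2: attach the overall average so the card can compare against it
SELECT d.`date`, d.daily_count, a.avg_count
FROM daily_counts d
CROSS JOIN (SELECT AVG(daily_count) AS avg_count FROM daily_counts) a
The beast mode on the output is then just `daily_count` - `avg_count`.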
-
@Valiant posted an answer to this question here: https://dojo.domo.com/t5/Domo-Developer/how-to-open-a-standard-card-of-DOMO-in-a-new-tab-from-a-custom/m-p/46655/highlight/true#M1425 for those searching and coming to this page.
-
@CurtisS - How is the underlying data in the datasource structured? That will help determine how to best create what you're looking for.
