Comments
-
You can change the Position of the text and Rotate settings under "Data Label Settings" properties on your chart in Analyzer.
-
The documentation is a bit buried, but this may help explain how the lastvalue parameters work in Workbench:
-
Can you provide a sample CSV or excel file (with just the dates with their field names, and no other identifying information)?
-
I don't think there's an option but it'd be a great idea for the Idea Exchange.
-
Twitter has recently rolled out its new v2 API which essentially paywalls API access to read the data. I don't believe the Domo connector has been updated to use this new one yet. https://twittercommunity.com/t/announcing-new-access-tiers-for-the-twitter-api/188728
-
I missed that in your screenshot, but you did have the backticks. The forums interpreted it as a code block when it was pasted in. 2, 4, 8, and 10 are comments where you can add additional information for context; they won't be evaluated, which is what the two dashes represent as part of the beast mode. What I'll do when…
-
That's a great first attempt. What you're missing is surrounding your field names with backticks (`) to show that they are fields in your dataset and not actual beast mode code syntax. CASE WHEN `Actual Business Go-Live 1 Date_p` IS NULL THEN -- If Initial Business Go-Live 1 Date_p IS NULL = 'On-Time' CASE WHEN `Initial…
-
My understanding is that Workbench is built off of the .NET platform and not the Java platform so you should be ok.
-
Correct, this is because the page is locked to prevent editing from anyone except for admins and the page owner (the lock symbol next to your Keyword Performance title).
-
Not directly. There is a Domo Stats dataset for Workbench which has higher-level information. You can export all of the workbench jobs and their definitions in JSON format on the server itself. You can use the wb.exe executable in your Workbench installation with a command prompt to export all of the jobs and then take…
-
Partitions are typically used to replace a group of records. If you have some records being replaced for a date, then pull in all records for that date and use the date as your partition key. This will replace all records with that date. Alternatively, if records aren't being deleted, you can utilize UPSERT in workbench to…
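The replace-by-partition-key behavior can be sketched in plain Python (hypothetical records, just to illustrate the idea: every existing row sharing the partition date is dropped, then the new batch for that date is appended):

```python
# Hypothetical illustration of partition-based replacement:
# all existing records sharing the partition key (the date) are
# dropped, then the new batch for that date is appended.
def replace_partition(dataset, new_records, partition_key):
    dates_in_batch = {r[partition_key] for r in new_records}
    kept = [r for r in dataset if r[partition_key] not in dates_in_batch]
    return kept + new_records

existing = [
    {"date": "2024-01-01", "amount": 10},
    {"date": "2024-01-02", "amount": 20},
]
batch = [{"date": "2024-01-02", "amount": 25}]
result = replace_partition(existing, batch, "date")
# All 2024-01-02 rows are replaced by the new batch; 2024-01-01 is untouched.
```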
-
Separate variables would be a good use case here: two for the starts and two for the ends, then just compare within a beast mode whether the date is between what was supplied.
-
Yes, you'd need to utilize a group by function. You can either use a function in the group by or use a formula tile to put your CASE statement in and then do a distinct count with a group by tile.
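The formula-tile-then-group-by pattern can be sketched in Python (hypothetical data and threshold, just to show the two steps: a CASE-style formula followed by a COUNT(DISTINCT ...) grouped by category):

```python
from collections import defaultdict

# Hypothetical data: count distinct "big" customers per region.
rows = [
    {"region": "East", "customer": "A", "sales": 120},
    {"region": "East", "customer": "A", "sales": 80},
    {"region": "East", "customer": "B", "sales": 30},
    {"region": "East", "customer": "B", "sales": 60},
    {"region": "West", "customer": "C", "sales": 200},
]

# Formula tile step: CASE WHEN `sales` >= 50 THEN `customer` END
for r in rows:
    r["big_customer"] = r["customer"] if r["sales"] >= 50 else None

# Group-by tile step: COUNT(DISTINCT big_customer) per region
distinct = defaultdict(set)
for r in rows:
    if r["big_customer"] is not None:
        distinct[r["region"]].add(r["big_customer"])
counts = {k: len(v) for k, v in distinct.items()}
# counts == {"East": 2, "West": 1}
```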
-
Are you keying your admitted based off of a STATUS or an ADMITDATE field? What issues are you running into with your beast mode? It looks like you're missing the trailing ) in your beast mode after the END.
-
This isn't a possibility with a beast mode as they'll have access to the dataset that's powering the card. You'll need to do the masking within an ETL and utilize PDP to filter only the rows that are allowed. You'll then need to add an additional field to your dataset to identify masked rows to allow users to see…
-
Currently the table is the only one that will work. If you want more control of the visualization, you can utilize a Domo Brick, which would support circular images.
-
I think you may be missing a double quote after src=
-
CASE WHEN `Actual 1` IS NULL THEN -- If Initial is blank = On-Time CASE WHEN `Initial 1` IS NULL THEN 'On-Time' -- If Adjusted <= 25 days after Initial (or earlier than Initial) = On-Time. WHEN `Adjusted 1` <= DATE_ADD(`Initial 1`, interval 25 day) then 'On-Time' -- > 25 days = Overdue WHEN `Adjusted 1` > DATE_SUB(`Initial…
-
I'd recommend looking into the TIME_TO_SEC function to convert your string into total seconds; then you can divide that number by 3600 to convert it to hours and then SUM it. TIME_TO_SEC returns an elapsed number of seconds for all values in a date/time column: TIME_TO_SEC('DateCol')
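The seconds-then-hours conversion can be sketched in Python (assuming, hypothetically, the strings are in HH:MM:SS format):

```python
# Sketch of TIME_TO_SEC followed by /3600 and SUM,
# for hypothetical duration strings in HH:MM:SS format.
def time_to_sec(t):
    h, m, s = (int(p) for p in t.split(":"))
    return h * 3600 + m * 60 + s

durations = ["01:30:00", "00:45:00", "02:15:00"]
total_hours = sum(time_to_sec(t) for t in durations) / 3600
# total_hours == 4.5  (1.5 + 0.75 + 2.25)
```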
-
Should be something like below: CASE WHEN `field` = 'Bucket 1 Value' THEN `value` END You can then use the aggregation options if you want to do a sum / min / max / avg. Just make sure you change the logic for your two different buckets
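The bucket logic above can be sketched in Python (hypothetical field names and values, just to show the CASE-returns-NULL-otherwise behavior followed by a SUM aggregation):

```python
# Hypothetical rows; the CASE returns the value only for the
# matching bucket and NULL (None) for everything else.
rows = [
    {"field": "Bucket 1 Value", "value": 10},
    {"field": "Bucket 2 Value", "value": 5},
    {"field": "Bucket 1 Value", "value": 7},
]

def bucket_value(row, bucket_label):
    # CASE WHEN `field` = bucket_label THEN `value` END
    return row["value"] if row["field"] == bucket_label else None

# SUM aggregation skips the NULLs, like Domo's SUM would.
bucket1_sum = sum(
    v for r in rows
    if (v := bucket_value(r, "Bucket 1 Value")) is not None
)
# bucket1_sum == 17
```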
-
You can set your y-axis to be your total number / metric and then use two beast modes to calculate each of your two buckets for your bar, dragging each beast mode into your series section of your chart.
-
If you do a DOMO.log in the connector IDE code, is it returning the same JSON you're seeing with Postman?
-
Thanks for posting your solution @SeanPT ! Glad you got it figured out.
-
What format is your data in? Are you attempting to store all of the values within a single column, or is each response a separate record in your dataset? How are you seeing the 1.5 value and the 3k value?
-
You can utilize your variable and select the different fields depending on the value of your variable: CASE WHEN `variable` = 'Date Value' THEN `date field` ELSE `category field` END Toss that beast mode into your sort and select sort descending.
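The variable-driven sort can be sketched in Python (hypothetical field names and values; the CASE picks which field feeds the sort, then you sort descending on the result):

```python
# Hypothetical rows with both a date field and a category field.
rows = [
    {"date_field": "2024-03-01", "category_field": "B"},
    {"date_field": "2024-01-15", "category_field": "C"},
    {"date_field": "2024-02-10", "category_field": "A"},
]

def sort_key(row, variable):
    # CASE WHEN `variable` = 'Date Value' THEN `date field`
    # ELSE `category field` END
    return row["date_field"] if variable == "Date Value" else row["category_field"]

# Sort descending on the chosen field (ISO dates sort lexicographically).
by_date = sorted(rows, key=lambda r: sort_key(r, "Date Value"), reverse=True)
# The first row is the latest date, 2024-03-01.
```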
-
Your AE would be the person to help you get an answer to this. I'd recommend reaching out to them.
-
You can't exactly pull out a match, but you can use REGEXP_REPLACE to find the string and remove everything else from it: REGEXP_REPLACE(`value`, '^.*(MATCHCODE).*$', '$1')
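The same keep-only-the-match trick can be sketched with Python's re.sub (note the backreference syntax is \1 instead of $1; the input string here is hypothetical):

```python
import re

# Sketch of REGEXP_REPLACE(`value`, '^.*(MATCHCODE).*$', '$1'):
# the whole string is replaced by just the captured group.
value = "prefix text MATCHCODE trailing text"
extracted = re.sub(r"^.*(MATCHCODE).*$", r"\1", value)
# extracted == "MATCHCODE"
```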
-
Are there any other sort of identifiers on how to extract the code? Is it always [space]AAA[space] for the format? Would that ever appear multiple times in the string?
-
Are the pfilters getting set correctly and have you validated the format of passing in the pfilters? Have they been url encoded?
-
Currently this isn't an option. What you could do is utilize a beast mode to check whether they have Prelim selected and then return a string to use on a text box card, displaying a warning above your variable control letting the user know 3rd party isn't available.