Comments
-
You can't currently, but I'd recommend adding a new idea to the Idea Exchange. As a workaround, I'd recommend writing your query in an actual database IDE and then pasting it into Workbench.
-
I'd recommend logging a ticket with Domo Support, as they were having API issues last night and this may be related.
-
What I recommend is creating your own date dimension dataset with your defined periods (FY and Next FY). This gives you a single date field you can use in your chart, and a beast mode can then conditionally display either FY or Next FY. I've done a writeup in the past on this for period-over-period analysis you can…
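As a rough sketch of what the date dimension's period column could look like (a Magic ETL Formula tile expression; the July 1 fiscal year start and the `Date` field name are assumptions, adjust to your calendar):

```sql
-- Tag each calendar date with its fiscal year label
-- (assumes a fiscal year starting July 1, so 2024-07-01 falls in 'FY2025')
CASE
    WHEN MONTH(`Date`) >= 7 THEN CONCAT('FY', YEAR(`Date`) + 1)
    ELSE CONCAT('FY', YEAR(`Date`))
END
```

Joining your fact data to this dimension on the date gives you one field that always holds the right period label for the chart.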
-
This appears to be a system-wide issue. I'd recommend logging a ticket with Domo Support. I'm having the same issue as well.
-
There isn't currently a way to spread out the data labels, as they're tied to the end of the lollipop. You can submit an idea to the Idea Exchange to improve label positioning and spacing. In the meantime, you can make the chart longer or display less data so there's more space for the labels and lollipops.
-
What issue are you experiencing? How is it not working? Have you tried splitting out your conditions into separate beast modes and using a table card to see the values for each row of data?
-
You can use a Formula Tile and call your new field 'Completed' and use a CASE statement as your formula: CASE WHEN `Screening Date` IS NULL THEN 'No' ELSE 'Yes' END
-
You can use a beast mode to calculate the average: AVERAGE(`Loan Amount Field`). Then you can use it as a summary number, or put it into a tooltip field and reference the tooltip in either your data label stubs or hover text settings.
-
@ArborRose The last column is showing 1 / count because it's the summary row on a table, it's not data within the dataset. I agree with @ST_-Superman-_ a Fixed function filter should work to have you only see sites with >= 3 tickets.
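For reference, the FIXED-based filter could be sketched something like this (`Ticket ID` and `Site` are placeholder field names, and double-check the exact FIXED syntax in your instance):

```sql
-- Ticket count per site, independent of the card's grouping;
-- save as a beast mode and filter the card on this value >= 3
COUNT(COUNT(`Ticket ID`)) FIXED (BY `Site`)
```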
-
CASE WHEN DAYOFWEEK(CURRENT_DATE()) IN (1,7) THEN 'Weekend' ELSE 'Weekday' END
(DAYOFWEEK returns 1 for Sunday through 7 for Saturday, so 1 and 7 are the weekend days.)
-
Are you attempting to use a single field as the key field in your recursive dataflow join? You should be able to select multiple key columns that uniquely identify each record; just make sure you're joining on all of your key columns.
-
You're correct that it's doing it on a per-row basis. If you want the minimum across your entire dataset, you can use a window function: MIN(MIN(`Report_Date`)) OVER ()
-
Hi @alyssamanse ! Through the UI there isn't a way to bulk add a transform to each of the jobs. The command-line wb.exe also doesn't allow for updating jobs. It's a manual process currently.
-
Domo does have beast mode references in the pipeline, which will let you reference one beast mode from another. I don't know when it'll be released; however, they did tease it at the last Domopalooza, so maybe soon. You could then define your "function" as a beast mode and then reference it within your other beast modes on…
-
How are you setting the value of inputData? Are you querying your dataset? Is your column data an array or is it set to nothing?
-
You could do a replace-method update on your S3 dataset and then feed it into a Magic ETL dataflow that outputs to another dataset, where you can set the output method to partition and define your partition key. This would get around the merge issue you're running into. As for the merge issue itself, how many merge keys are…
-
DDX Bricks are pre-compiled Domo apps. Because of this, the dataset names are pre-defined in the manifest file used to generate the bricks and can't be changed in a DDX Brick. If you want to go a level deeper in the development stack, you can build a custom app, which allows you to define the datasets and names…
-
I'd recommend logging a ticket with Domo Support for their development team to look into.
-
This hasn't been released to General Availability yet. I'm hoping it'll be released soon, maybe with the Domopalooza release. You may be able to reach out to your CSM to see if it's an available beta for you to have enabled in your instance.
-
I believe the ID should be unique. @NoahFinberg - can you confirm?
-
Creating DDX apps requires the Create DomoApps grant as well; the grant permissions window is unclear about this. Saving data from the app would also require AppDB permissions.
-
This is likely caused by a missing common algorithm used to encrypt the traffic between the SFTP server and WB (they can't speak the same language). You may need to install other encryption algorithms on your SFTP server to allow WB to communicate. Domo Support may be able to tell you what encryption algorithms are…
-
Without seeing each of your bar values, I believe the chart line is taking an average of each percentage/bar in your chart, but the summary number is taking the percentage across the entire dataset.
-
Likely you have multiple records with the same SalesOrderNo value, so when it aggregates across everything it takes the distinct count across all superintendents instead of adding up each superintendent's individual distinct count.
-
Because your aggregation is within your case statement, it's getting treated differently. Try this instead: COUNT(DISTINCT CASE WHEN HdrParentItemCode = 'ZPUNCH' AND JT158_WTParent = 'Y' THEN sonum_wtnumber END) / COUNT(DISTINCT CASE WHEN JT158_WTParent = 'Y' THEN SalesOrderNo END)
-
Looks like you're using %22, which is a " character and will end the HREF attribute on your A tag. Try using single quotes (%27) instead.
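As a sketch with placeholder names (the URL field and link text are hypothetical), the concatenated link could look like this, with %27 wrapping the attribute values:

```sql
-- Single quotes (%27) inside the markup so the double quote doesn't
-- terminate the HREF early
CONCAT('<a href=%27', `URL Field`, '%27 target=%27_blank%27>Open link</a>')
```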
-
Right click on the page and choose Inspect, then click on the Network tab. Reload the page and see if any of the requests come back with an error (shown in red). This may give you more information about why it's failing to let you edit the brick.
-
How you solve this will depend on your dataset and whether you're recording individual days regardless of pass/fail or only the failures. Do you have some anonymized sample data to help illustrate this for us?
-
I don't believe this is possible within a Brick due to the security concerns around the storage and usage of account credentials.
-
The issue is that your metadata variable is undefined because you need to handle the Promise. I'd try something like:

domo.get(metadataQuery, { format: 'array-of-arrays' }).then(function processMetadata(metadata) {
  fields1 = metadata.columns;
  …
  domo.get(query1).then(…);
});