Comments
-
Here's what I have for Code Engine (Function, Input, Output, Test): Instead of manually inputting the PageIds like you did in your video, I enter the results from the query (Results). However, it works perfectly when I manually input the PageIDs, as you demonstrated in the video. When I deploy the workflow with the same…
-
@DanHendriksen Sorry, I misspoke. I created a new package using the function provided. I added that function after the SQL query and used the output of the query as the input to the function. This results in the following error: { "status": 404, "statusReason": "Not Found", "message": "This function FAILED. The return was…
-
Wow! Thanks @ArborRose! I'll give this a try
-
@DanHendriksen Not yet…It looks like I'm gonna need to get a little more creative with creating the list. The function ChatGPT provided couldn't be used in the tile that requests the list/number for pageID.
-
Thanks for the response @GrantSmith
-
Results using queryWithSql: { "results": [ { "PageID": "198669628" }, { "PageID": "809811304" }, { "PageID": "1632530999" }, { "PageID": "1235494000" }, { "PageID": "1465556204" }, { "PageID": "1010091898" }, { "PageID": "1637505344" }, { "PageID": "756469963" }, { "PageID": "1425968126" }, { "PageID": "103381472" }, {…
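The rows above can be flattened into the plain list of Page IDs the tile expects. A minimal sketch, assuming the query step's output has exactly the shape shown (a `results` array of objects whose `PageID` values are strings); the function name is illustrative, not from the original package:

```javascript
// Assumption: queryOutput looks like the Results above,
// i.e. { results: [ { PageID: "198669628" }, ... ] } with string values.
// Inside a Code Engine package you would also need the runtime module:
//   const codeengine = require('codeengine');
function extractPageIds(queryOutput) {
  // Pull each row's PageID string out and convert it to a number,
  // since the downstream tile wants a plain list of numeric Page IDs.
  return queryOutput.results.map(row => Number(row.PageID));
}
```

Feeding the tile this flattened array, rather than the raw query object, mirrors the manual-input case that already works.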
-
Hey @DanHendriksen Thank you for the reply. Using the "Loop Over Dataset" template: my current query is "SELECT CAST("Page ID" AS LONG) FROM dataset WHERE "Owner User ID" NOT LIKE '%1153317885%'", which creates a list/object output titled datasetRows. I'm not sure what to do on the "Get the Current Row" portion, but when I…
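For the "Get the Current Row" portion, a hypothetical loop body might just accumulate the cast value from each row. This sketch assumes the SQL selects the column under an alias (e.g. `CAST("Page ID" AS LONG) AS PageID`), so each entry in `datasetRows` looks like `{ PageID: 198669628 }`; the alias is an assumption, since the query as quoted has none:

```javascript
// Assumption: datasetRows is an array of row objects shaped like
// { PageID: 198669628 }, produced by an aliased CAST in the query.
function collectPageIds(datasetRows) {
  const pageIds = [];
  for (const row of datasetRows) {
    pageIds.push(row.PageID); // "current row" = one element of datasetRows
  }
  return pageIds;
}
```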
-
Huge shoutout to @DanHendriksen for taking some time to walk me through this. I used the code he provided above and created a new package. One small line of code that didn't get copied over was the one defining 'codeengine'. He scheduled a call with me, provided the missing code (const codeengine = require('codeengine');), and…
-
Thanks @DanHendriksen So…I'm going to need some hand-holding here since I'm not as smart as you. I use fetchGroup with the input "BAC Group", which creates the output "group". This generates the following object data: { "group": { "id": 1806246612, "name": "BAC Group", "type": "closed", "userIds": [ 1496850857, 162776153 ],…
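The "list of persons" can be read straight off that object: `group.userIds` is already an array of user IDs. A sketch, assuming `fetchGroup`'s output keeps the shape shown above (the helper name is illustrative):

```javascript
// Assumption: fetchGroupOutput looks like
// { group: { id: ..., name: ..., type: ..., userIds: [ ... ] } }.
function getGroupUserIds(fetchGroupOutput) {
  // The array of person IDs is nested under group.userIds.
  return fetchGroupOutput.group.userIds;
}
```

Whatever downstream step expects a list of person IDs could then take this array directly.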
-
@DanHendriksen For the kids in the back of class…is there a resource that explains how to create a list of persons from the returned IDs?
-
I found a solution to this issue. First - Huge shoutout to @DashboardDude who took time out of his busy schedule to meet with me and help me on my way to a solution. To simplify, I had a beast mode similar to SUM(Priced Items) / SUM(Pounds Shipped). @DashboardDude noticed there were several beast modes like this but…
-
@DashboardDude
-
@DashboardDude I verified the fields are numeric, for the most part. Attached you'll find the screenshots you requested. Thank you in advance for your help
-
@ColemenWilson The green blocks contain data
-
@ColemenWilson Thanks for the reply. Those are unchecked but we still have the same issue. We also noticed this happening to our subtotals on another card…
-
Thanks @GrantSmith. This provides the majority of the information I'm looking for, which is why I marked this as answered. It would be nice if it included group schedules and job ID. Thanks again!
-
@Zoolander Sounds good. Thank you for the help!
-
@Zoolander I'd be interested in learning about that endpoint idea of yours. @MarkSnodgrass Thank you for the reply. I'll take a look at Domo Workbench Enterprise.
-
@rado98 Make sure you run Workbench as an administrator and you're logged in as the same user who created the backup file. An issue we ran into after doing this was that Workbench duplicated our datasets. The fix is to map the duplicate datasets to your original datasets. To map your datasets, you'll navigate…
-
@GrantSmith could you break that down Barney style for me? We're seeing access in the users and groups on the server machine. Is that what you're referring to?
-
Yup - they have the grant
-
Well, it's September 2023 now. Looking for the same solution so I guess it hasn't been fixed yet