Comments
-
Converting to the new upsert will allow your dataflow to run more quickly, as you'll only pull in the data you're updating instead of the entire dataset. A recursive dataflow starts to run slower over time because it needs to pull in the ever-increasing dataset. Upsert also simplifies the logic of your dataflow. I'd…
-
SUM(`quantity` / SUM(1) FIXED (BY `product_id`)) Use a FIXED function to determine the number of records you have for each product, divide the quantity by that to get a fractional quantity, and then SUM it all together to get the overall quantity.
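As a worked example with made-up numbers: if a product's single order line for quantity 8 gets duplicated across 4 rows by a join, SUM(1) FIXED (BY `product_id`) returns 4 for that product, each row contributes 8 / 4 = 2, and the rows sum back to 8 instead of the inflated 32.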
-
You'd need to create another beast mode with a case statement to return a numerical value for the day in the order you want: CASE WHEN DAYNAME(`date`) = 'Sunday' THEN 8 ELSE DAYOFWEEK(`date`) END DAYOFWEEK returns 1-7 for Sunday-Saturday, and the case statement just shifts Sunday to 8 so it's last. Put this beast mode in your…
-
Have you tried leveraging domo.onFiltersUpdate()? https://developer.domo.com/portal/e947d87e17547-domo-js#domoondataupdate
-
For reference, here's a list of things that are supported in an HTML table. JavaScript isn't supported.
-
This is because your beast mode is returning NULL for some records (where none of the conditions are met). You can filter out the NULLs and that will remove them from the chart.
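As a rough illustration (using a hypothetical `status` field, not your actual beast mode), a case statement without an ELSE returns NULL for anything it doesn't match:
CASE WHEN `status` = 'Open' THEN 'Open' WHEN `status` = 'Closed' THEN 'Closed' END
Records matching neither condition come back as NULL and show up as the blank value on the chart, so filtering the NULLs out (or adding an ELSE bucket if you'd rather keep those records) takes care of it.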
-
Have you tried configuring the Pagination settings on your No Code connector? It appears that Rippling uses a next URL link, which you can configure in that section. You'll need to select "Get next URL from result" and then use the interface to determine how to parse out the next_link field for Domo to use to get the next…
-
You can try using a window function to compare row numbers and conditionally pick up only the first row as your data: CASE WHEN SUM(1) OVER (PARTITION BY `ProjectId` ORDER BY `BatchTimeStamp`) = 1 THEN YOUR_AMOUNT_HERE ELSE 0 END Since SUM(1) with an ORDER BY is a running count, it equals 1 only on the earliest `BatchTimeStamp` row within each `ProjectId` (assuming timestamps are unique), which is what lets it act like a row number here.
-
Try /data/v1/leads instead of /data/v2/leads
-
@MarkSnodgrass you rang? @Kms A POST to https://INSTANCE.domo.com/api/social/v4/alerts/series?cardId=CARD_ID will return what's used to populate the alert modal. You'll need to play around with the body, but it should look something like: {"filterGroupIdArray":[ID_NUMBER]} I'd recommend monitoring the traffic for each card to see…
-
There would be. Since you're querying Snowflake, you'd have the Snowflake querying costs as well as the Domo ingestion costs if you're on Domo's consumption model.
-
In your manifest file you should have an alias defined that is mapped to a specific dataset. You'll want to make sure you're using that alias instead of leads.
-
As long as you create new Workbench instances and aren't "migrating" Workbench servers, it'll create different jobs. The one thing to look out for is updating the same dataset from different Workbench accounts; that's when you'll get some confusion, but if all the jobs and datasets are separate then you'll have no…
-
I'd recommend stacking your data so you have two copies of the data in your dataset: one being the original data and the other being the data with the sensitive fields masked or anonymized. You can then add a column, Anonymized for example, with two values, Yes and No. With No being set for the original data and…
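If you're doing the stacking in a SQL dataflow, a rough sketch (hypothetical table and column names, with `email` standing in for a sensitive field) would look something like:
SELECT `customer_id`, `email`, 'No' AS `Anonymized` FROM `input_table`
UNION ALL
SELECT `customer_id`, 'REDACTED' AS `email`, 'Yes' AS `Anonymized` FROM `input_table`
The same pattern can be built in Magic ETL by adding the flag column to each branch and appending the two branches together.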
-
You can do this with a window function and UNIX_TIMESTAMP: UNIX_TIMESTAMP(TIMESTAMP(`Timestamp`)) - UNIX_TIMESTAMP(LAG(TIMESTAMP(`Timestamp`)) OVER (ORDER BY TIMESTAMP(`Timestamp`))) TIMESTAMP converts your string to an actual timestamp, and UNIX_TIMESTAMP returns the number of seconds since 1/1/1970; taking the difference…
-
You could use a recursive-type dataflow where you take your existing output dataset and use it as an input dataset in the same dataflow, update the calculation, and then save it back to the output dataset. Here's some documentation on recursive dataflows:
-
Do you have the same dataset powering both Dashboard 1 and Dashboard 2? Domo dashboards don't communicate with each other, so unless you have the same dataset powering both dashboards you won't be able to do this.
-
You can use SuiteAnalytics, but that's an additional cost from NetSuite to allow you to pull data that way.
-
You can use a Rank and Window tile with LAG to get the prior value based on your sorting.
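Conceptually, the tile is doing the SQL equivalent of (placeholder column names):
LAG(`Amount`) OVER (PARTITION BY `Id` ORDER BY `Date`)
which returns the previous row's `Amount` within each `Id`, using `Date` as the sort, so you can compare the current value to the prior one.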
-
I don't believe the Domo Brick currently allows for collection syncing to a stored Dataset. You'd need to use a custom app and set the syncEnabled property in your manifest file:
-
How far out does your data go? If you want to see future months, you need to have those months in your dataset.
-
TBA uses the newer 2.0 version of SuiteScript and setup is a bit easier.
-
You can use the Campaigns App:
-
You can follow the help doc outlining the steps to take for the TBA connector and how to get these pieces of information:
-
Make sure you have the proper IP addresses whitelisted for Azure and Domo to communicate. You can read more about this here:
-
In order to more easily manage and group cards, it'd be great if there were a way to tag cards and save the searches / filters for those tags, like we have with datasets.
-
If you name the fields exactly the same and they have the same format / values, then when you filter the page it'll filter both cards. You'll need to transform your data to either add a new column or format your existing column so both datasets share the same format.
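For example, if one dataset already has an `Order Date` date column and the other stores the date as an `order_dt` string (hypothetical names and format), a SQL transform along these lines would line them up:
SELECT `order_id`, STR_TO_DATE(`order_dt`, '%Y-%m-%d') AS `Order Date` FROM `input_table`
Once both datasets expose an `Order Date` column with the same name, type, and values, a single page filter on `Order Date` will apply to cards from either dataset.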
-
You could try using a window function to get the total for each bucket/partition and then filter based on that number: COUNT(DISTINCT `Feedback`) OVER (PARTITION BY `Country`)
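If you need that count in a filterable field, you could wrap it in a beast mode along the lines of (the threshold of 5 is just an example):
CASE WHEN COUNT(DISTINCT `Feedback`) OVER (PARTITION BY `Country`) >= 5 THEN 'Include' ELSE 'Exclude' END
and then filter the card to the 'Include' value.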
-
I'm not certain window functions are available in the SQL tile. You can use a Rank and Window tile after you select your resulting table from your SQL query.
-
You'd need to split out the data or cards your alert is based on. Currently it's not possible to do this after PDP is applied.