Comments
-
I'd recommend switching your logic over to a Magic ETL. Domo's MySQL dataflows run on MySQL 5.6, an older version, and by their nature they execute sequentially, so they can run much slower. I've seen Magic ETL dataflows with the same logic take up to 90% less time to execute.
-
You can. On the dashboard there's a wrench icon in the upper right; select Save As and it'll ask whether you want to duplicate the cards or not.
-
Are you utilizing Color Rules in your chart?
-
Hi @Nandha_Kumar_1 You only have 323k rows, so there are likely some chances for efficiency improvement. When I'm designing an ETL, I like to be a teapot: short and stout. Short: filter your data as soon as possible to reduce the number of rows. Use a Select Columns or Alter Columns tile to select or drop columns, respectively, so…
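A rough SQL sketch of that filter-early principle (hypothetical table and column names; in a Magic ETL these steps would be Filter and Select Columns tiles rather than SQL):

-- Prune rows and columns at the very start of the pipeline,
-- so every downstream join and aggregation touches less data.
SELECT
    order_id,     -- keep only the columns the rest of the flow needs
    product_id,
    quantity
FROM raw_orders
WHERE order_date >= '2024-01-01'   -- filter rows before any joins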
-
I'd recommend logging a ticket with support to have them assist with this, as it's an issue with the underlying architecture.
-
I'd recommend logging a ticket with Support, as it appears to be a backend issue.
-
Currently this isn't possible; however, I'd recommend adding an idea to the Idea Exchange to enhance the email connector to include metadata like the subject.
-
For OAuth connections that don't have a dedicated connector, I try to utilize the JSON No Code OAuth connector to connect to those APIs.
-
Because you have variable-length strings, I'd recommend utilizing a Formula tile in a Magic ETL to perform a regular expression.
FGI: REGEXP_REPLACE(`Destination`, '^[^\-]*\-((\w\-)?FGI)-(\d{2}\.\d{2}\.\d{4}.*)$', '$1')
Tail end: REGEXP_REPLACE(`Destination`, '^[^\-]*\-((\w\-)?FGI)-(\d{2}\.\d{2}\.\d{4}.*)$', '$3')
You…
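As a hedged illustration with a hypothetical Destination value shaped to match that pattern:

-- Hypothetical input: 'WH1-A-FGI-01.15.2024-B2'
-- '$1' returns 'A-FGI'          (the optional prefix plus FGI)
-- '$3' returns '01.15.2024-B2'  (the date and everything after it)
REGEXP_REPLACE('WH1-A-FGI-01.15.2024-B2', '^[^\-]*\-((\w\-)?FGI)-(\d{2}\.\d{2}\.\d{4}.*)$', '$1')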
-
Converting to the new upsert will allow your dataflow to run quicker, as you'll only pull in the data that you're updating instead of the entire dataset. With a recursive dataflow, it starts to run slower over time as it needs to pull in the ever-increasing dataset. It also simplifies the logic of your dataflow. I'd…
-
SUM(`quantity` / SUM(1) FIXED (BY `product_id`))
Use a FIXED function to determine the number of records you have for each product, divide the quantity by that count to get a fractional quantity, and then SUM it all together to get the overall quantity.
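A hedged sketch of the same idea in standard SQL (hypothetical table and column names), for readers outside Beast Mode:

-- Hypothetical fan-out: product A duplicated across 4 rows of quantity 8.
-- Each row contributes 8/4 = 2, so the total comes back as 8, not 32.
SELECT SUM(quantity * 1.0 / cnt) AS total_quantity   -- * 1.0 avoids integer division
FROM (
    SELECT quantity,
           COUNT(*) OVER (PARTITION BY product_id) AS cnt  -- records per product
    FROM order_lines
) t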
-
You’d need to create another beast mode with a case statement to return a numerical value for the day in the order you want:
CASE WHEN DAYNAME(`date`) = 'Sunday' THEN 8 ELSE DAYOFWEEK(`date`) END
DAYOFWEEK returns 1-7 for Sunday-Saturday; the case statement just shifts Sunday to 8 so it’s last. Put this beast mode in your…
-
Have you tried leveraging the domo.onFiltersUpdate() method? https://developer.domo.com/portal/e947d87e17547-domo-js#domoondataupdate
-
For reference, here's a list of things that are supported in an HTML table. JavaScript isn't supported.
-
This is because your beast mode is returning NULL for some records (where none of the conditions are met). You can filter out the NULLs to remove them from the chart.
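A minimal sketch of why that happens (hypothetical beast mode): a CASE with no ELSE returns NULL whenever no condition matches.

-- Hypothetical: rows where `status` is neither value fall through to NULL,
-- which shows up as its own slice on the chart until it's filtered out.
CASE WHEN `status` = 'Open'   THEN 'Active'
     WHEN `status` = 'Closed' THEN 'Done'
END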
-
Have you tried configuring the Pagination settings on your No Code connector? It appears that Rippling uses a next-URL link, which you can configure in that section. You'll need to select "Get next URL from result" and then use the interface to determine how to parse out the next_link field for Domo to use to get the next…
-
You can try using a window function to compare row numbers and conditionally count only the first row as your data:
CASE WHEN SUM(1) OVER (PARTITION BY `ProjectId` ORDER BY `BatchTimeStamp`) = 1 THEN YOUR_AMOUNT_HERE ELSE 0 END
-
Try /data/v1/leads instead of /data/v2/leads
-
@MarkSnodgrass you rang? @Kms A POST to https://INSTANCE.domo.com/api/social/v4/alerts/series?cardId=CARD_ID will return what's used to populate the alert modal. You'll need to play around with the body, but it should look something like: {"filterGroupIdArray":[ID_NUMBER]} I'd recommend monitoring the traffic for each card to see…
-
There would be. As you're querying Snowflake, you'd have the Snowflake querying costs, plus the Domo ingestion costs if you're on Domo's consumption model.
-
In your manifest file you should have an alias defined which is mapped to a specific dataset. You'll want to make sure you're using that alias instead of leads.
-
As long as you create new Workbench instances and aren't "migrating" Workbench servers, it'll create different jobs. The one thing to look out for is updating the same dataset from different Workbench accounts; that's when you'll get some confusion, but if all the jobs and datasets are separate then you'll have no…
-
I'd recommend stacking your data so you have two copies of the data in your dataset: one being the original data and the other being the data with the sensitive fields masked or anonymized. You can then add a column, Anonymized for example, with two values, Yes and No, with No being set for the original data and…
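A hedged sketch of that stacking in SQL (hypothetical table and column names; in a Magic ETL this would be constant/formula columns plus an Append Rows tile):

-- Original rows, flagged as not anonymized
SELECT name, email, revenue, 'No' AS Anonymized
FROM customer_data
UNION ALL
-- The same rows with sensitive fields masked, flagged as anonymized
SELECT 'REDACTED' AS name,
       'REDACTED' AS email,
       revenue,
       'Yes' AS Anonymized
FROM customer_data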
-
You can do this with a window function and UNIX_TIMESTAMP:
UNIX_TIMESTAMP(TIMESTAMP(`Timestamp`)) - UNIX_TIMESTAMP(LAG(TIMESTAMP(`Timestamp`)) OVER (ORDER BY TIMESTAMP(`Timestamp`)))
TIMESTAMP converts your string to an actual timestamp; UNIX_TIMESTAMP returns the number of seconds since 1/1/1970, taking the difference…
-
You could use a recursive-type dataflow, where you take your existing output dataset and use it as your input dataset in the same dataflow, update the calculation, and then save it back to the output dataset. Here's some documentation on recursive dataflows:
-
Do you have the same dataset powering both Dashboard 1 and Dashboard 2? Domo dashboards don't communicate with each other, so unless you have the same dataset powering both dashboards you won't be able to do this.
-
You can use SuiteAnalytics, but that's an additional cost from NetSuite to be able to pull data that way.
-
You can use a Rank and Window tile with LAG to get the prior value based on your sorting.
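For reference, a hedged SQL equivalent of what that tile computes (hypothetical table and column names):

-- Prior value within each group, using the same sort the tile would use
SELECT order_date,
       amount,
       LAG(amount) OVER (PARTITION BY product_id ORDER BY order_date) AS prior_amount
FROM sales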
-
I don't believe the Domo Brick currently allows for collection syncing to a stored Dataset. You'd need to use a custom app and set the syncEnabled property in your manifest file:
-
How far out does your data go? If you want to see future months, you need to have those months in your dataset.