Comments
-
Hi @ozarkram What errors are you receiving? Have you used back ticks (`) in your beast mode? CASE WHEN `Plant` = 'Celes' AND DAYOFWEEK(`orderdate`) IN (2,3,4,5) THEN '50' WHEN `Plant` = 'menes' AND DAYOFWEEK(`orderdate`) IN (2,3,4,5) THEN '90' END
-
Hi @shielmunroe, Reach measures how many unique people saw your content. This is calculated on Facebook's end. Whatever granularity you pass to Facebook (Month, Day, etc.) is what it uses to calculate the uniqueness. So if I happen to view your content twice in a month on two separate days I'd have 2 records of 1 for each…
-
Hi @thwartted You can't aggregate an aggregate in a normal beast mode. You'd need to utilize an ETL to aggregate your data with a count by whatever grouping you wish and then use that dataset to calculate the average in the beast mode in your card (or alternatively just calculate the average after your count aggregation in…
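To make that two-step pattern concrete, here's a minimal sketch in pandas (the 'Category' column and the data are made up for illustration): the ETL does the count per group, and the card's beast mode then only needs a plain average over the pre-counted column.

import pandas as pd

# Hypothetical input: one row per record, with a 'Category' grouping column.
df = pd.DataFrame({'Category': ['A', 'A', 'B', 'C', 'C', 'C']})

# Step 1 (the ETL): count records per group and write that out as a new dataset.
counts = df.groupby('Category').size().reset_index(name='record_count')

# Step 2 (the card's beast mode): a simple average over the pre-counted column.
average_of_counts = counts['record_count'].mean()
print(counts)
print(average_of_counts)  # 2.0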
-
Hi @RSViju How is your data structured? The filter cards on your dashboard should apply both filters to your dataset.
-
Hi @ozarkram I'd recommend configuring a date dimension dataset to more easily track YOY differences / comparisons. I've done a write-up on this topic in the past which you can find here: https://dojo.domo.com/discussion/53481/a-more-flexible-way-to-do-period-over-period-comparisons#latest You can then utilize a beast mode to…
-
In Workbench you define the upsert key in the job configuration. Are you wanting to enable upsert on a dataset which already has duplicate upsert keys? Each new job would need to have the upsert key defined, as each dataset would have different keys.
-
Create a beast mode and just add your values together: `Field 1` + `Field 2` + `Field 3`
-
No, as Domo would see them as two completely separate servers.
-
Ah, embeds are unique to the user who created them within Domo, so you'd have two separate embed codes since you're using two separate accounts. Not sure why Domo did it that way but that's the current behavior.
-
Depending on how technical you are you could utilize a Python script and the pydomo package to upload your CSV data straight to Domo. If you have a computer on which Workbench is installed but don't want to install it on your other computer, you could utilize a shared drive which the server with Workbench could read…
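For reference, a minimal sketch of the pydomo route, assuming you've created a client ID/secret at developer.domo.com and that your CSV loads cleanly into pandas (the file name, dataset name, and credentials below are placeholders); the ds_create / ds_update helpers come from the pydomo quickstart, so adjust if your version differs:

import pandas as pd
from pydomo import Domo

# Placeholder credentials -- create a client ID/secret at developer.domo.com.
CLIENT_ID = 'your-client-id'
CLIENT_SECRET = 'your-client-secret'
domo = Domo(CLIENT_ID, CLIENT_SECRET, api_host='api.domo.com')

# Load the local CSV into a DataFrame.
df = pd.read_csv('my_data.csv')

# First run: create a new dataset from the DataFrame.
dataset_id = domo.ds_create(df, 'My CSV Upload')

# Subsequent runs: replace the dataset's contents with the latest file.
# domo.ds_update(dataset_id, df)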
-
Hi @Raymond You can now find the limit rows field under the Filter & Sort section.
-
Certification only applies to cards and datasets so changing the logic of a dataflow won’t affect anything unless you change the schema of the output dataset that is certified. There is no downstream decertification that occurs if an upstream dataset is decertified. I’d recommend suggesting the downstream certification in…
-
It seems like it should be the same embed ID if you're logged in as the same user. I'd reach out to Domo Support to see if it's a bug.
-
You can use a beast mode to add all three values together and plot that as your fourth value.
-
Have you looked into the activity log dataset from DomoStats? That’ll tell you everything that happens in your instance.
-
I'd recommend looking into the Java CLI Domo offers. It contains a swap-owner command which allows you to change the ownership of alerts, cards, dataflows, datasets, groups, pages and reports. https://domohelp.domo.com/hc/en-us/articles/360043437733-Command-Line-Interface-CLI-Tool
-
I just posted something for a similar question but you can use the calendar.csv dataset from the Domo Dimensions connector. Then you can use MySQL to get your desired output (MagicETL doesn't support conditional joins like this directly. You'd have to do a cartesian join [add constant column to both - say a value of 1 -…
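To see the shape of that constant-column cartesian join, here's a rough pandas sketch of the same idea (the order table, column names, and dates are made up; the MySQL or MagicETL version follows the same pattern of joining on the constant and then filtering to the date range):

import pandas as pd

# Hypothetical input: one row per order with a start and end date.
orders = pd.DataFrame({
    'order_id': [1, 2],
    'start_date': pd.to_datetime(['2023-01-01', '2023-01-10']),
    'end_date': pd.to_datetime(['2023-01-03', '2023-01-12']),
})

# Stand-in for the calendar.csv dataset from Domo Dimensions.
calendar = pd.DataFrame({'dt': pd.date_range('2023-01-01', '2023-01-31')})

# Add a constant column (value of 1) to both sides and join on it -> cartesian product.
orders['join_key'] = 1
calendar['join_key'] = 1
expanded = orders.merge(calendar, on='join_key')

# Keep only the calendar dates that fall inside each order's range.
expanded = expanded[(expanded['dt'] >= expanded['start_date']) &
                    (expanded['dt'] <= expanded['end_date'])].drop(columns='join_key')
print(expanded)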
-
@AJ2020 Because of how Domo interprets your data on a row-by-row basis you'd need to have a record for every day (or month, depending on how granular you want to get with your data). You can do this in an ETL. MySQL would logically be simpler because you can do conditional joins, but MagicETL may be quicker. To start you…
-
When you're filtering it's removing records from your dataset, so there are fewer values, causing your percentage to increase. I'd recommend looking into segments to calculate the total across your dataset without having it be affected by filters. You can read more on them here:…
-
You can use HTML coding in a beast mode to accomplish this. I’ve written about this in the past here: https://dojo.domo.com/discussion/54552/dp22-using-beast-mode-to-build-data-storytelling-links-and-images/p1
-
I did. Apologies! If you could accept the answer so others can easily find it if they have the same question, I'd appreciate it.
-
Definitely log a ticket with support and outline the impact of it being slow. They tend to be sensitive about slowness.
-
Have you tried converting your series directly to a category after you import it? raw['column'] = raw['column'].astype('category')
-
In the Domo Governance Dataset Connector there's a Card Fields and Beast Modes report which should list all the fields used on a card for you. You can then take that data, group by the dataset and field and count the distinct number of card IDs to see how many cards are using the field.
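As a rough sketch of that grouping step, assuming you've exported the report to a CSV and that its columns look roughly like 'Dataset Name', 'Field Name', and 'Card ID' (check the actual column names in your instance):

import pandas as pd

# Load the exported Governance report (file name and column names are assumptions).
report = pd.read_csv('card_fields_and_beast_modes.csv')

# Group by dataset and field, counting the distinct cards that reference each field.
field_usage = (
    report.groupby(['Dataset Name', 'Field Name'])['Card ID']
          .nunique()
          .reset_index(name='Cards Using Field')
)

print(field_usage.sort_values('Cards Using Field', ascending=False).head(10))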
-
Try converting the series itself as a category: raw['VEEVA Entity ID'] = raw['VEEVA Entity ID'].astype('category', errors='raise')
-
According to that documentation you should be able to supply it as a GET parameter with the value of a CSV list of fields you wish to average: var dataAggregateBy = 'day'; var dategrain = [dataDateColumnName + ' by ' + dataAggregateBy]; var query =…
-
You should be able to apply a page filter on the page to see if a date field is in between two other dates; however, you won't be able to use two filter cards to do this as they filter rows of data based on exact matches.
-
The ETL Formula tile doesn't seem to want to handle single quotes correctly. I've been able to get it to remove single quotes using the Replace Text tile, but you may have to put in the single quote a couple of times, as I've had it remove the single quote and replace it with an empty string. You can copy the JSON code…
-
Hi @ozarkram Views aren’t supported as an input dataset into MySQL data flows. You can only use them as inputs in the new MagicETL.
-
Have you tried using the same account for both connection methods and confirming the data you're getting back is the same or different?
