Comments
-
If the preview of the document doesn't load (it didn't for me after I posted), you should still be able to download it by hitting the "download" button in the bottom right corner
-
@Abhigyan Michelle's solution should work but I thought I would offer up an alternative that doesn't involve hiding columns. In your case, add Product as the lone column in your table card. Then, place Product in the "Sorting" box and choose the aggregation as: Count. From there, you can add Category as a Quick Filter and…
-
Nice... if you have that ability then sometimes that is just the easiest way!
-
@Jeffsnake Did that solution work for you?
-
Hey @cthtcc , You could name your beast mode something like "SQ and SQ_1 are null?" and use this code CASE WHEN `SQ` IS NULL AND `SQ_1` IS NULL THEN 'Yes' ELSE 'No' END Then drop that into the Filter section of the Analyzer and select only 'Yes' You can test to make sure it works by creating a table card with three…
-
Hey @Jeffsnake I think you should be able to do option 2 in ETL. Here is a screenshot of the tiles you can use: The basic logic there is to set your Date column as Text so that you can then Replace Null values with a "dummy date" like 2299-12-31. Then you append those back to all rows where there is already a date, set the…
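For anyone who prefers SQL over the ETL tiles, the same "dummy date" idea can be sketched with a single COALESCE (table and column names here are placeholders, not from the original post):

```sql
-- Replace null dates with a far-future "dummy date" so every row keeps a
-- sortable, filterable date value (2299-12-31 matches the example above)
SELECT
  `Order ID`,
  COALESCE(`Date`, '2299-12-31') AS `Date`
FROM input_table
```

In a dataflow this avoids the text-cast-and-append round trip entirely, since COALESCE handles the null and non-null rows in one pass.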
-
I can give you a two-card approach that will allow you to build a Table that displays MTD Sales next to PY MTD sales on one card and then YTD Sales next to PY YTD Sales. Let's start with Month to Date: First, we are going to create a "Filter Series" that we will drop in the Filters section of the card. The reason for this…
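The rest of this comment is cut off, but a "Filter Series" beast mode along these lines is one plausible way to tag rows for the MTD card (this is a guess at the approach, using MySQL-style beast mode functions; `Date` is a placeholder column name):

```sql
-- Tag each row as current MTD, prior-year MTD, or excluded,
-- so both series can sit side by side on one card
CASE
  WHEN YEAR(`Date`) = YEAR(CURDATE())
       AND MONTH(`Date`) = MONTH(CURDATE())
       AND DAYOFMONTH(`Date`) <= DAYOFMONTH(CURDATE()) THEN 'MTD'
  WHEN YEAR(`Date`) = YEAR(CURDATE()) - 1
       AND MONTH(`Date`) = MONTH(CURDATE())
       AND DAYOFMONTH(`Date`) <= DAYOFMONTH(CURDATE()) THEN 'PY MTD'
  ELSE 'Exclude'
END
```

Dropping this in the Filters section and selecting 'MTD' and 'PY MTD' would limit the card to the two comparable date ranges.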
-
I would use a Redshift SQL data flow which will allow you to write Rank & Window functions using Redshift syntax. You can view their documentation examples here
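As a quick sketch of what that syntax looks like (table and column names are invented for illustration):

```sql
-- Rank each product's sales within its category, highest first,
-- using a Redshift-style window function
SELECT
  `Category`,
  `Product`,
  `Sales`,
  RANK() OVER (PARTITION BY `Category` ORDER BY `Sales` DESC) AS sales_rank
FROM sales_table
```

The same OVER (PARTITION BY … ORDER BY …) pattern works for ROW_NUMBER, DENSE_RANK, LAG/LEAD, and the aggregate window functions.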
-
@hamza_123 Bummer! What specifically did not work? This is a "live" beast mode that you should be able to drop into any data set to link to Google as a test: CONCAT('<div><a href="https://www.google.com/"','"target="_blank"><font color=#00c200>','Link to Google','</a></div>') If you want only one row to show up, you can…
-
@Ritwik I was able to display black text with your green background using this beast mode (note: I used a column called `Completion Date` so you will want to replace that with your `Actual First Units` field) CONCAT('<font color="black"> <div style="background-color:#98FB98; width: 100%; height:100%; margin:-20px;…
-
@Ritwik @hamza_123 Try using this beast mode format: CONCAT('<div><a href="https://..."','"target="_blank"><font color="black">',`id`,'</a></div>') You should be able to type in other colors where it says "black" or you can use Hex values like this which would display green: CONCAT('<div><a…
-
@user084398 I really think it might be that it was created with settings that have been phased out. That concept of being able to append easily inside of a data flow seems so familiar, but I can't find a way to replicate it. I think you can feel comfortable reaching out to DOMO Support with a link to the data flow and ask…
-
@Simon_King I never got a notification that you responded! Bummer! Okay, I have added a few more points in bold for what will hopefully make it more clear. When you read the steps I outlined, go ahead and type out the SQL statements manually instead of copying and pasting from the Word document. I have shown where if you…
-
Re-reading your post, I realize that you might also be wondering how, from a technical standpoint, the data flow is working. Is that what you were actually wondering about? If so, I think I would also be confused. If the transform for "shapshot_table" is a simple SELECT statement and there is no more code after the FROM…
-
Hi @user084398 If your input data always displays the most up-to-date or "current" view of the data then the benefit of the data flow that you inherited is that you are able to see how certain measurements are changing over time. Let's say you want to show how "Total Forecast Amount" has changed over time. By…
-
Am I correct in assuming that your original data set is updating using the "Append" method? If yes, I think we can get you what you need with a SQL data flow. I have attached a Word document with an example that I hope will help you get the data you need. Let us know if this works or if you need a little more tailored…
-
Someone on our team ended up helping me out: If you need to pull an Excel file from SharePoint, we ended up using the Document Report. Then in Server Relative URL you use this pattern: /sites/sitename/Shared%20Documents/Folder%20Name/File%20Name.xlsx The sitename can be found by clicking on your SharePoint site and then…
-
@guitarhero23 Do you have any tips on configuring the setup in DOMO to pull an Excel file into DOMO through the SharePoint Online connector? I have followed the current documentation as closely as possible but continue to receive "Domo is ready, but value specified for 'server relative url' is not valid" I am using this…
-
@PhilD Okay - I see now that you can use the connector to "Pull data for all surveys" and that you are wanting to pivot the data based on this big data set. I think all we would need to do is, as @ST_-Superman-_ mentioned, add a unique identifier. To do this, we could combine the Survey ID and Response ID as a new column…
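One simple way to build that unique identifier, either in a dataflow or as a beast mode (using the column names discussed in the thread):

```sql
-- Combine Survey ID and Response ID into one key that is unique per response
CONCAT(`Survey ID`, '-', `Response ID`)
```

The separator keeps IDs like 12 + 34 from colliding with 1 + 234 when the values are concatenated.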
-
@PhilD did this solution work for you?
-
Hey Phil, I was able to create a data set with one row per response for a Qualtrics "Survey Responses" report using the settings in the attached PDF. In your new output data set, you can use a beast mode to calculate the number of responses: COUNT(`Response ID`) and then Department should now be a dimension that you can drop…
-
I was curious about Redshift run times being inconsistent and posed a question on DOJO day. Here is the question and response: https://dojo.domo.com/t5/Beast-Mode-ETL-Dataflow/Redshift-vs-MySQL-vs-ETL/m-p/38009 I tested one of my data flows that had the same requirements you have (select all columns, but convert the…
-
Hey Olivia, I updated my data set to include a date column and am attaching an example of a window function you can use to select the maximum value for each day (which should be the "Sent" value) and ascribe it to each row within that day. If you send multiple emails per day, you could consider partitioning by 'email_name'…
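The attached example isn't visible here, but the window function I'm describing looks roughly like this (table and column names are placeholders for the attachment):

```sql
-- Take each day's maximum value (the "Sent" count) and ascribe it
-- to every row within that day
SELECT
  `date`,
  `metric`,
  `value`,
  MAX(`value`) OVER (PARTITION BY `date`) AS sent_that_day
FROM email_stats
```

If you send multiple emails per day, PARTITION BY `date`, `email_name` would keep each email's Sent count separate.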
-
Sure! So in this case, we will need to add a column to our original data that stores the Total Sent value next to each value. This will enable us to perform a calculation on each row where we can divide the metric value by the Total Sent value. I've attached another PDF of how I did this using a Redshift transform and how…
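The attached PDF isn't shown here, but the Redshift transform step amounts to something like this (names are placeholders):

```sql
-- Store Total Sent next to each row, then divide to get a rate per metric.
-- The ::FLOAT cast avoids Redshift's integer division truncating to zero.
SELECT
  `date`,
  `metric`,
  `value`,
  `value`::FLOAT / MAX(`value`) OVER (PARTITION BY `date`) AS pct_of_sent
FROM email_stats
```

Once pct_of_sent exists as a column, any card can aggregate or display it without needing a cross-row beast mode.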
-
I believe that Beast Modes can only perform calculations on rows, which means we may not be able to display your data in a vertical table perfectly. Depending on how your data set is structured you have a few options... See the attached PDF for scenarios I have drawn up for you.
-
If this is a data set that you are analyzing, it seems like you could utilize the ROW_NUMBER function. When you use the function to order by date in ascending order, it would make your earliest date have a row number of 1 and count up from there. Additionally, if there are batches of data you are pulling in, you could use…
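As a minimal sketch of that first idea (table and column names are placeholders):

```sql
-- Earliest date gets row number 1, counting up from there
SELECT
  `date`,
  ROW_NUMBER() OVER (ORDER BY `date` ASC) AS row_num
FROM my_table
```

For batched data, adding a PARTITION BY on whatever column identifies the batch would restart the numbering within each batch.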
-
I think your best bet is to keep experimenting with additional columns to pinpoint where the system is changing your dates. Check to see what happens when you write: CAST(a.date_created as datetime) as 'Date Set' and then run the data flow. This should give you a hint as to what happens when you use…
-
I just recently discovered a "Domo Governance Dataset" within the list of connectors. It looks like there is a Beast Modes data set that could contain this information. However, it appears you can only access the data if you have administrator access.
-
Is it possible that your file name has changed, or that the tab name within a file has changed? I would check the Details section to ensure that the specifications you entered for the file are still correct.
-
This may be occurring if you have your Company Time Zone Settings set to your local time. If this is the case, then when you begin using native SQL commands, the system treats the time within those commands as UTC. If you are west of the Central Time Zone then you are at least -7 hours from UTC, so in your example the system…
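If you need to undo that shift inside a MySQL dataflow, one option is CONVERT_TZ with explicit offsets (the '-07:00' here is an example for Mountain time, and a fixed offset will not account for daylight saving changes):

```sql
-- Shift a timestamp the system treated as UTC back to a fixed local offset
SELECT
  CONVERT_TZ(`date_created`, '+00:00', '-07:00') AS `date_created_local`
FROM my_table
```

Using offset strings like this works even when the server's named time zone tables aren't loaded.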