Comments
-
Do you have the time zone as part of the datetime data (e.g., EST, PST, CST)? If so, you can build a CASE statement to convert it to UTC (or standardize it to any time zone that you want). You can use the SUBTIME function to subtract the necessary hours. Here is a list of time zone abbreviations to look for.
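A quick Python sketch of that CASE-per-abbreviation idea (the offset table here is hypothetical and ignores DST, so treat it as illustrative only):

```python
from datetime import datetime, timedelta

# Hypothetical fixed offsets in hours behind UTC; real abbreviations are
# ambiguous and daylight saving shifts them, so adjust for your data.
OFFSETS = {"EST": 5, "CST": 6, "MST": 7, "PST": 8}

def to_utc(local_dt, tz_abbrev):
    # Mirrors the CASE ... SUBTIME pattern: shift the local time by the
    # zone's offset to land on UTC.
    return local_dt + timedelta(hours=OFFSETS[tz_abbrev])

print(to_utc(datetime(2023, 1, 1, 12, 0), "EST"))  # 2023-01-01 17:00:00
```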
-
Domo assumes that all times coming in are converted to UTC. Depending on where you are looking at your data, it will show it in UTC time or in the time zone set in your Company Settings. If you are uploading your CSV file via Domo Workbench, you can use the transformation settings to move all datetime values to UTC when it…
-
It would be helpful to see the convert statement that you are using and more clarification as to which ETL tool you are using in Domo (Magic ETL, MySQL, Redshift, etc.). If I had to guess, your Gross charge column is of type text. You can use the REPLACE function to remove the $ from your data and then try to convert…
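The REPLACE-then-cast idea, sketched in Python for illustration (also stripping thousands separators, which the currency format usually includes):

```python
def clean_currency(value):
    # Equivalent of REPLACE(field, '$', '') followed by a numeric cast;
    # commas are stripped too so "1,234.56" converts cleanly.
    return float(value.replace("$", "").replace(",", ""))

print(clean_currency("$1,234.56"))  # 1234.56
```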
-
You can do a calculated field and then use LEFT and LENGTH to remove the last 2 characters. It would look like this: LEFT(`field`, LENGTH(`field`) - 2)
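The same trim-the-last-two-characters logic in Python, for comparison:

```python
def drop_last_two(field):
    # LEFT(field, LENGTH(field) - 2) is just slicing off the last 2 chars.
    return field[:-2]

print(drop_last_two("12345"))  # 123
```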
-
Domo does have a Sankey chart you can use. You will find it under Other Charts in the chart types. Here's information on how to power it.
-
You might need to provide some sample rows of your data. I took your final Beast Mode: (CASE WHEN (IFNULL(`qty found`,0)) = 0 THEN 0 ELSE (SUM(IFNULL(`qty sold instock`,0))/SUM(IFNULL(`qty found`,0))) END), put together some sample rows with the same columns that look like this, and the Beast Mode calc worked.
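The guard in that Beast Mode is a divide-by-zero check. Here is the same pattern collapsed to scalars in Python, with None standing in for NULL (an illustrative sketch, not the aggregated SUM version):

```python
def safe_ratio(qty_sold, qty_found):
    # IFNULL(x, 0): treat NULL (None) as 0, then guard the division
    # exactly like the CASE WHEN ... = 0 THEN 0 branch.
    found = qty_found or 0
    sold = qty_sold or 0
    return 0 if found == 0 else sold / found

print(safe_ratio(3, 4))     # 0.75
print(safe_ratio(3, None))  # 0
```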
-
According to this KB article, https://domohelp.domo.com/hc/en-us/articles/360057021074-OneDrive-Writeback-Connector it seems like it is contingent on Microsoft's API.
-
I'm not sure there is a limitation on the SQL you can construct, other than that it uses MySQL 5.6, and I've read about a limitation of 1 million rows. I agree that expanded documentation in this area would be helpful.
-
I'm pretty sure you will need to determine your max value in Magic ETL first by using the group by tile and then join it back to the main dataset so that you have the date that the max value occurred on for that type.
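The Group By tile plus join-back pattern, sketched in plain Python with hypothetical rows so the flow is visible:

```python
# Hypothetical input rows: one value per type per date.
rows = [
    {"type": "A", "date": "2023-01-05", "value": 10},
    {"type": "A", "date": "2023-03-01", "value": 40},
    {"type": "B", "date": "2023-02-10", "value": 25},
]

# Step 1 (Group By tile): max value per type.
max_by_type = {}
for r in rows:
    if r["type"] not in max_by_type or r["value"] > max_by_type[r["type"]]:
        max_by_type[r["type"]] = r["value"]

# Step 2 (Join tile): join back to the main dataset so each max value
# carries the date it occurred on.
result = [r for r in rows if r["value"] == max_by_type[r["type"]]]
print(result)
```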
-
On the computer where my workbench is installed, I see all the log files are stored in C:\ProgramData\Domo\Workbench\Logs and there is a new log file for each date. Uploading each log file would be quite tedious, but I did find a way via command prompt to combine multiple text files into one. The full walkthrough is here:…
-
@joshpropp It is a feature that has been suggested in the Ideas Exchange and has received a lot of upvotes. You can help upvote it here: Unfortunately, it has not been implemented yet.
-
@kivlind no problem! If this worked for you, please mark the answer as accepted so that it can help others in the community.
-
As long as the column name that you're clicking on exists in the other dataset, it will filter for you. That means if you click on country name, for example, country name would need to be in the other dataset. You also need to turn on the interaction filter on your page filter for the card-to-card filtering to work. I would also…
-
If you are using the mega table card, you can use \\n to force a new line. Example: CONCAT('Name','\\n','Address') If you are using the HTML table card, you can use <br />, which is the HTML tag for a line break: CONCAT('Name','<br />','Address')
-
@swagner great use of regex by @GrantSmith as usual. Just to see if I could, I came up with a way to do it in Beast Mode: SPLIT_PART(`String`,'-',LENGTH(`String`) - LENGTH(REPLACE(`String`,'-','')) + 1) To dynamically determine the number of dashes in the string, you can utilize the LENGTH and REPLACE functions. I am…
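For illustration, here is the same count-the-dashes trick in Python: the LENGTH minus LENGTH(REPLACE(...)) expression is just counting delimiters to find the index of the last segment.

```python
def split_part_last(s, delim="-"):
    # LENGTH(s) - LENGTH(REPLACE(s, '-', '')) counts the dashes;
    # adding 1 gives the 1-based index of the final part for SPLIT_PART.
    n_parts = s.count(delim) + 1
    return s.split(delim)[n_parts - 1]

print(split_part_last("ab-cd-ef-gh"))  # gh
```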
-
@GrantSmith I knew you were going to suggest regex! 😂
-
If you want to pull 3 different files from somewhere, you have to build three different connections, one for each file. This is the case for any connection type (SharePoint, SFTP, Workbench). Each file needs its own connection.
-
@swagner Assuming you have always 3 dashes in there (as shown in your examples) you can use SPLIT_PART to get the string after the 3rd dash, which is the 4th part. SPLIT_PART(`String`,'-',4) Hope this helps.
-
@Jessica You will probably need to provide more specifics as to what fields are in your dataset to work with, but generally I would say yes you can do this without a recursive. In the ETL, you can split off the IDs and dates in your dataset and then group by max date. Then join it back to your main dataset and join on the…
-
I know the mega table limit has been in place since it was created. I don't have specific experience with the scheduled reports limitation, but I would be surprised if the CSV export was providing more data than the card that it is based off of.
-
If you are using the mega table card type, it is limited to 25,000 rows. Additionally, Scheduled Reports CSV attachments have a 5 MB limit, but it sounds like the row limitation is your issue. To get around this, you would need to look to export the actual dataset and there isn't a straightforward scheduled option to do…
-
The 1k, 10k, 100k, etc. options are for deciding how many rows you want to include from your input datasets. This is helpful if you have some joins in your dataset and, depending on how your input dataset data is sorted, you might not see any results for your join unless you tell the preview to process a larger chunk of your…
-
Unfortunately, no. The preview window will only let you scroll through 100 rows. It would be nice if it did. Next to Run Preview, you can change how much data it will pull in (the default is 10k rows), but that still won't affect how many rows are in the preview window. You can help your troubleshooting a little bit by…
-
@MichaelClark Try breaking things down. Start with only having the last sync field in the table card and setting the aggregation to max. This should give you one row with the latest sync date in your entire dataset. If you are seeing more than one row, then a setting in Analyzer is causing the issue. The most common reason…
-
@user030156 To your original question, I would advocate for implementing the simplest and most straightforward solution so that others are easily able to understand it if they have to manage the card or dataset down the road. Did you try just choosing MAX in the aggregate options when you dragged the date field into your…
-
You can't edit an uploaded dataset directly in Domo, but there are other ways to change the data. In Magic ETL, you can use various tiles to replace data based on certain criteria. You can also join to another dataset in Magic ETL that has the correct data and use the columns from that dataset in your final output.
-
You could create a beast mode that is MAX('last_sync') . This will give you the latest sync date based on the other columns that you are choosing to display in your card.
-
To determine the 2nd Tuesday of the month, you could use the DAYNAME function and the DAY function. The 2nd Tuesday of the month can only occur between the 8th and the 14th of the month, so you could write a beast mode like this:…
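A small Python sketch of that DAYNAME-plus-DAY check, which shows why the 8–14 window works:

```python
from datetime import date

def is_second_tuesday(d):
    # Beast Mode idea: DAYNAME(d) = 'Tuesday' AND DAY(d) BETWEEN 8 AND 14.
    # The 2nd occurrence of any weekday always falls on day 8 through 14.
    return d.weekday() == 1 and 8 <= d.day <= 14  # weekday() 1 == Tuesday

print(is_second_tuesday(date(2023, 6, 13)))  # True
```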
-
It's a little unclear what you are trying to do, but if you change 1 FOLLOWING to 0 FOLLOWING, that should eliminate the total in the first row. If you really want previous values and next values, then consider using LAG for the previous and LEAD for the following, and then you could total them together in the next tile.
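For anyone unfamiliar with LAG/LEAD, here is what they produce, sketched in Python on a hypothetical ordered column (None plays the role of NULL at the edges):

```python
# LAG(value, 1) shifts the column down one row; LEAD(value, 1) shifts it up.
values = [10, 20, 30]
lagged = [None] + values[:-1]  # previous row's value
led = values[1:] + [None]      # next row's value

# Each tuple is (previous, current, next) for one row.
print(list(zip(lagged, values, led)))  # [(None, 10, 20), (10, 20, 30), (20, 30, None)]
```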
-
I would look at this post by @GrantSmith who lays out this functionality nicely.
