Comments
-
When new dates come into the data, they aren't collapsed. Ideally they would just be collapsed like you see in the screenshot.
-
-
I felt like I was going crazy not being able to find an option to do this by default.
-
We finally cracked this nut this month. Here is a big thing I was missing - the datasets created by the CLI tool or Workbench can have many, many partitions. The limit is MUCH higher than 1500. You still don't want to get stupid with them, but there is more room. Magic can only process 1500 partitions. So when you bring in…
-
It looks like there is a parquet reader built into the Domo CLI tool.
-
Yep - I found that and upvoted it.
-
@AlexF - We have gotten in the habit with a lot of datasets of just expecting that at some point something can go wacky and we can get duplicate data. So we think about data coming in and wonder if there is a way for duplicate data to come in (a script runs twice ... whatever) and build that into our dataflows. Maybe we…
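To sketch what I mean by building duplicate-tolerance into a flow (plain Python, with made-up field names just for illustration): if the same load runs twice, keep one row per key and prefer the latest copy.

```python
# Hypothetical sketch: tolerate a script running twice by keeping only the
# newest copy of each record. Field names (order_id, import_time, amount)
# are invented for the example.

def dedupe_latest(rows):
    """Keep one row per order_id, preferring the most recent import_time."""
    latest = {}
    for order_id, import_time, amount in rows:
        if order_id not in latest or import_time > latest[order_id][1]:
            latest[order_id] = (order_id, import_time, amount)
    return sorted(latest.values())

rows = [
    (101, "2021-03-01 02:00", 19.99),
    (102, "2021-03-01 02:00", 5.00),
    (101, "2021-03-01 02:05", 19.99),  # same order loaded twice
]
print(dedupe_latest(rows))
```

In a real dataflow the same idea is usually a group-by on the key taking the latest batch, but the shape of the logic is the same.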
-
Right on!
-
Absolutely - we have so many cards that say "Last 8 days" so we can really get the last 7 REAL days. And this is applicable on the week, month level. I often don't want the current week or month as they are still in progress.
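The "Last 8 days to get 7 real days" trick boils down to excluding the in-progress day. A quick sketch of the same idea:

```python
from datetime import date, timedelta

def last_complete_days(today, n=7):
    """Return the n most recent COMPLETE days, excluding today (still in progress)."""
    return [today - timedelta(days=i) for i in range(n, 0, -1)]

days = last_complete_days(date(2021, 3, 15))
print(days[0], days[-1])  # oldest and newest complete day; today itself is excluded
```

The same exclusion applies at the week and month grain: drop the current period, keep the last n finished ones.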
-
@jaeW_at_Onyx 's video saved my butt because I've gone DEEP down the rabbit hole with Magic v2 and ran face-first into needing a BETWEEN JOIN. That trick will absolutely work for me .... for now. I add 48 rows every day, so a year from now n = 48 × 365 = 17,520 and sum = n(n+1)/2 = 153,483,960 rows :/
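For anyone checking my math, the triangular-number growth is easy to verify:

```python
def triangular(n):
    """Running total of 1 + 2 + ... + n, i.e. n*(n+1)/2."""
    return n * (n + 1) // 2

rows_per_day = 48
n = rows_per_day * 365     # 17,520 rows after a year
print(triangular(n))       # 153,483,960
```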
-
Well, I wouldn't call that a total bust. I remember looking at this years ago and thinking it could be handy but I had since forgotten about it. Unfortunately, this doesn't have what I need. I can list scheduled reports (for my user), I can enable and disable. But I can't send a report even for myself. I went back to Domo…
-
I'm not as familiar with the Java CLI @jaeW_at_Onyx but I'll check that out tonight. I'm not sleeping so why not?
-
You're in a good spot, actually. I get very frustrated with how Domo handles datetime data because it makes the assumption that the data coming in is UTC. Assuming your instance has the global setting of a timezone defined, when you bring in the data in Workbench you don't have to do an offset - it will treat it like it is…
-
Atanas got me going right with that KB article and those URLs. The other URL worked for me.
-
That link is just dropping me off to a blank page.
-
Support helped me work through this! 1) The error message is just not a good error message. What it is really saying is "We can't get your file". 2) As of right now the Connector inline help says you can use an HTTP connection. This is not the case. HTTPS is supported. That's why it couldn't get my file. 3) It turns out I…
-
Oof. I think I'd rather just use Python to parse the data and spit out a CSV. I already used Python to remove the brackets that the connector doesn't like, but now it puked on the next bit.
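For what it's worth, the bracket-stripping step is only a few lines. This is a sketch, and I'm assuming an input shape of one bracketed, comma-separated record per line, which may not match the real feed:

```python
import csv
import io

def strip_brackets_to_csv(raw_lines, out_file):
    """Remove the [ ] wrappers the connector chokes on and re-emit as CSV.

    Assumed (hypothetical) input: one bracketed, comma-separated record per line.
    """
    writer = csv.writer(out_file)
    for line in raw_lines:
        cleaned = line.strip().strip("[]")
        writer.writerow(field.strip() for field in cleaned.split(","))

buf = io.StringIO()
strip_brackets_to_csv(["[1, alpha, 3.5]", "[2, beta, 7.25]"], buf)
print(buf.getvalue())
```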
-
Unfortunately that wasn't the case. I ended up backing up the registry and then deleting the WindowLayout and JobGridLayout keys. Once I restarted Workbench, my GUI was reset and I was good to go. Once it was reset I could see what you and support were talking about, but that wasn't my issue.
-
The heat map idea is great. That worked out really well. Thank you for the suggestion on that. I went with 15-minute blocks and it is some great data viz. On the timestamps, they are all 'local time', which is what I want. Of course in Workbench I flag them all as Central and in Domo I have central time zone selected. I…
-
@magicdust - so far I'm doing it a couple of different ways. 1) Most of my data is coming in from MS SQL so when I run a workbench job that appends data I also write back to a log table I created to let me know which records were uploaded and when. 2) I parse the log files in C:\ProgramData\DomoWorkbench for the jobs…
-
I'll give a heat map a try. The time stamps represent when a sale was made, e.g. 11:15 am. With 37 million records, every minute from open to close is accountedted for with at least a few thousand sales. I think to do the time as a heat map I'm going to have to group into 30-minute chunks. That Pareto is a great idea because…
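The 30-minute grouping is just flooring each timestamp to its half-hour block. A minimal Python sketch of the bucketing step (the actual grouping would happen in the dataflow):

```python
from datetime import datetime

def bucket_30min(ts):
    """Floor a sale timestamp to its 30-minute block for heat-map grouping."""
    return ts.replace(minute=ts.minute - ts.minute % 30, second=0, microsecond=0)

print(bucket_30min(datetime(2021, 3, 15, 11, 15)))  # 11:15 am falls in the 11:00 block
```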
-
Thank you for these posts. I would love Domo to add a "just don't mess with my timestamps, ok?" button. We have restaurants across the country and the POS records everything in local time. As far as I'm concerned, local time is what I want. If I want to see how busy we are between 11 and 1 I want everyone's local 11 and 1.…
-
As far as I've been able to see, no. To be interactive, you have to be logged into your domo instance. I mean, you could do some iFrame stuff but people would still have to be logged in for that to work. You can do the publish as slide show thing and you can embed that but you don't really get the interactivity part of…
-
I've suggested we get a Domo connector for Domo so I can get data on my data. If they call it the Yo Dawg connector it will just make my day. For monitoring I use a combination of things; most importantly, I've set up alerts on all my dataflows. That is critical for monitoring for me. (I guess I could also set up a rule to…
-
In TSQL you could use CASE on the column to search for the strings Pushing or Pulling and return just that string. But I'm not sure if you can use that with ODBC in WB3, or if you are even pulling from a TSQL table. Sorry if that isn't much help.
-
This is fantastic. I am trying this for the first time today, and I'm using a field ImportTime for my lastvalue. I set up the initial variable using Edit Query Variables: ImportTime, with a timestamp that would be before all my data. I then added a line to my query: WHERE cs.[ImportTime] > '!{lastvalue:ImportTime}!' ORDER BY…
-
If you can get the % out of the value, you can cast it as a number. Then take that number and just divide by 100. There is your decimal value. So what about (going step by step) TRIM(TRAILING '%' FROM `your_column`) = '14' CAST(TRIM(TRAILING '%' FROM `your_column`) AS DECIMAL(5,2)) = 14.00 Finally --…
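The same step-by-step transformation, sketched in Python as a sanity check (strip the trailing %, cast to a number, divide by 100):

```python
def pct_to_decimal(value):
    """'14%' -> 0.14: strip the trailing %, cast to a number, divide by 100."""
    return float(value.rstrip("%")) / 100

print(pct_to_decimal("14%"))  # 0.14
```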