Comments
-
No, but we did need to put in a request to our Domo support person to get it to show up for us. It may be a permissions issue configured within her role on the Dayforce side, but the person whose credentials we tried is a Dayforce Admin, so I'm not sure. She has reached out to them.
-
I just wanted to follow up on this in case anyone else has a related issue. This was not a bug. The problem was that I was prefacing the columns with "zip2fip." I followed the advice below and it resolved my issue. "I will provide some clarification. The US map is unique in that it really could be one of several different…
-
I went ahead and tried all the zip2fips columns in both the United States map card and the US State map card and still had no luck. Just to test, I went into the ETL, removed the column name change, and reverted them to their natural names. Then they started working properly. So that is odd and I may need to submit a help desk…
-
Good feedback. The reason I had tried to avoid the alert method is that it is natural for pieces to not be updated in the database for 3-5 hours at certain times of the day. But I suppose I can set the alert to go off when a piece hasn't been updated in 12 hours. Ideally I'd like to catch the failure ASAP. But this will at least allow me…
-
Thanks, I was definitely overthinking this. For some reason I thought that approach wouldn't work when I started this, but it did. Chalk it up to an airhead moment.
-
After I reloaded it, it changed to say nothing would be affected. So that's good.
-
-
FYI, for "How to Use MetaData to Clean and Instance", I copied the code for my ETL. I'm not sure if maybe I am misunderstanding, or if maybe this is an accident, but the very bottom right filter (Find DS to remove) was set to <= 180 days and not >=180 days. Also, this was very helpful, I had tried building my own versions…
-
Never mind, I just figured out how to copy and paste the code. Sorry about that.
-
Hi there, in the dataflow presented by John Stevens, I just had a question about the joins following the Group Bys in the dataflow details and Datasets portion of the ETL (circled in my screenshot). From "dataflow details", we group by dataset output id and count their occurrences. When we join that group by back to…
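For reference, a rough SQL sketch of the step I'm asking about, using placeholder table and column names rather than the actual DomoStats Dataflow Details fields:

    -- Count how many times each output dataset id appears
    SELECT output_dataset_id, COUNT(*) AS occurrences
    FROM dataflow_details
    GROUP BY output_dataset_id;

    -- Join that aggregate back to the original rows on the same id
    SELECT d.*, g.occurrences
    FROM dataflow_details d
    JOIN (
      SELECT output_dataset_id, COUNT(*) AS occurrences
      FROM dataflow_details
      GROUP BY output_dataset_id
    ) g
      ON d.output_dataset_id = g.output_dataset_id;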
-
Hi there, I lost my notes. I'm trying to find the link to the DomoStats/Governance dataflow that was presented when discussing cleaning up your company Domo instance. I believe he had said he'd posted information in the Dojo about the ETL he had created.
-
Yes, one thing I have noticed is that the time it takes to run continues to increase. It used to be less than an hour, and now it takes over 13 hours, and it's not even a year old. One thing I was thinking was, instead of the way I currently have it, which sounds like an attempt at a recursive dataflow, is if, like Grant…
-
Okay, I ended up finding a way to make it work. It might be similar to what jaeW was explaining to me in his first comment, but I just wasn't quite grasping the whole thing. In the ETL, I took a branch off and grouped by Date, Staff, and Call Hours. Then another branch grouped by Date, Staff, and Description. Then I joined these two…
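In SQL terms, roughly what that looks like (the table name and the join keys are best-guess placeholders, not my exact setup):

    -- Branch 1: one row per Date / Staff / Call Hours combination
    SELECT `Date`, `Staff`, `Call Hours`
    FROM source_table
    GROUP BY `Date`, `Staff`, `Call Hours`;

    -- Branch 2: one row per Date / Staff / Description combination
    SELECT `Date`, `Staff`, `Description`
    FROM source_table
    GROUP BY `Date`, `Staff`, `Description`;

    -- Join the two branches back together on the shared Date and Staff columns
    SELECT a.`Date`, a.`Staff`, a.`Call Hours`, b.`Description`
    FROM branch1 a
    JOIN branch2 b
      ON a.`Date` = b.`Date`
     AND a.`Staff` = b.`Staff`;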
-
Okay, that makes sense. It works when I don't use the series because I'm not excluding the denominator from the single whole, but once I put them into buckets, the denominator becomes separated from the numerator, or something like that, I think?
-
Good morning. So now I'm having an issue with one of my beastmodes that I cannot quite figure out, and looking at the data table (SS2), it seems like it should be working. The issue comes into play when I add a series. The concept of the beastmode is taking the distinct count of HC_donor_call_outcomes.id and dividing it by the…
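The general shape of the beastmode is a ratio of two distinct counts, something like the sketch below (the denominator column is only a placeholder, not the real field):

    COUNT(DISTINCT `HC_donor_call_outcomes.id`)
    / COUNT(DISTINCT `placeholder_denominator_id`)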
-
Thanks everyone for the input. In the end I decided to go with the append Mark suggested because, like jaeW says, I like the visibility, especially for future troubleshooting. I ended up un-aggregating all of the other tables as well, with the exception of login hours per day, and used "staff" and "date" as a common…
-
Thanks guys, I'll try it out
-
Thanks, this was helpful. I did try_cast [column] as float between 9.5 and 11.99 and it is working now. First I tried to do it as numeric, but for some reason it would include 12.0-12.2 in the return.
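For anyone finding this later, a rough sketch of that filter in generic SQL (the column name is a placeholder, and the exact cast function available in your environment may differ):

    -- Cast to float, then keep only values from 9.5 through 11.99
    CASE
      WHEN CAST(`value_column` AS FLOAT) BETWEEN 9.5 AND 11.99 THEN 'Include'
      ELSE 'Exclude'
    END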
-
Thanks both for the feedback. I tried Grant's first just because he responded first and it was more similar to what I had already started. Grant, I was able to get it where I needed based on what you explained. The only thing I did differently was, after the rank and window, instead of the group by, I pivoted back and made…
-
Sorry for the delay, I usually get email notifications when someone responds to my Dojo question, but I didn't this time, so I assumed no one had responded yet. I will work with this info and follow up. Thanks!
-
Nevermind, I was able to figure this out.
-
Okay, thanks for the feedback!
-
I see that this post is from 2017, so maybe this was not an existing feature at that time, or maybe you have since figured it out. But in the chart properties within the Analyzer, under the data label settings pleat, if you select a data label setting value, it will then give you the option to pick a text color, and white is…
-
Sorry, I provided the wrong beastmode. Count is used in place of rank, the first word in my beastmode above.
-
Thanks for your suggestion, this worked. It is odd to me that, even though this beastmode kept it as a text column instead of a number column, I was still able to put it on my Y axis. When I put my original beastmode (time difference of stop and start time) into my Y axis, I could only get a count. When I changed it to no…
-
Thank you this worked perfectly.
-
Not sure why only 1 of my attachments posted
-
For the count (distinct case when 'successful/unsuccessful' = 'successful' then 'draw id' end) portion, it was giving the NULLs a count as well. So I modified your idea a bit to: case when (case when 'successful/unsuccessful' = 'successful' then 'draw id' end) is not Null then Count(Distinct case when…
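Spelled out, the shape of that pattern is roughly the following sketch, where backticks mark column names and the final THEN clause is an assumption based on the first half of the expression:

    case
      when (case when `successful/unsuccessful` = 'successful' then `draw id` end) is not null
      -- assumed continuation: count distinct draw ids only for the 'successful' rows
      then count(distinct case when `successful/unsuccessful` = 'successful' then `draw id` end)
    end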