Comments
-
I have asked the firewall question to someone with more knowledge than me, but I do know that we have other Workbench jobs set up on the same database/server that are connecting and running successfully. This job that is giving us the "attempting to open data source" struggles has done this many times in the past. It is a…
-
Yes, I think that might be the route we take. I didn't think this would be possible at the card level, but I was not sure so I thought I'd reach out to the Dojo. Thanks
-
It would be the distinct count of donation IDs grouped by [date range] and [acceptable procedures], then the count of distinct donor IDs grouped by the above donation ID counts. So for example, if I did this in ETL, I would filter donations to the appropriate date range and acceptable donation procedures, I would put that in…
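Roughly, in SQL terms, the idea would look something like the sketch below (the table and column names are just placeholders, not the actual schema):

```sql
-- Placeholder schema: a "donations" table with donor_id, donation_id,
-- donation_date, and procedure_code columns.
-- Step 1: distinct donation counts per donor within the date range and
--         acceptable procedures.
-- Step 2: count distinct donors for each donation-count bucket.
SELECT per_donor.donation_count,
       COUNT(DISTINCT per_donor.donor_id) AS donor_count
FROM (
    SELECT donor_id,
           COUNT(DISTINCT donation_id) AS donation_count
    FROM donations
    WHERE donation_date BETWEEN '2024-01-01' AND '2024-12-31'   -- [date range]
      AND procedure_code IN ('PROC_A', 'PROC_B')                -- [acceptable procedures]
    GROUP BY donor_id
) per_donor
GROUP BY per_donor.donation_count;
```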
-
If I do that, I get this: I feel like there would maybe need to be some grouping
-
Actually, this would be a better visual example
-
Maybe I need to uninstall and reinstall?
-
Also, after it spins, I get this message "Object reference not set to an instance of an object"
-
I think I found my mistake. After the join, it looks like this. So I needed to change my beastmode to this: SUM(SUM(lte_donation_general.donation_id_distinctcount_by_hour_by_year_tz) FIXED (BY SP_Fiscal_Calendar.Fiscal_Year, lte_donation_general.collection_date_dayofweek_tz,…
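For anyone reading along, the general shape of that nested FIXED pattern (shown here with shortened placeholder column names, since the full formula is cut off above) is something like:

```sql
-- Beast mode sketch: re-aggregate the inner sum at a fixed grouping level,
-- then sum the result. Column names are placeholders standing in for the
-- fields referenced in the formula above.
SUM(
    SUM(`donation_id_distinctcount`)
    FIXED (BY `Fiscal_Year`, `collection_date_dayofweek`)
)
```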
-
@FreddieG You need to set the update method to append, and then in the schema there should be an Upsert Key column. This was not appearing for me originally, so I had to contact Domo support and they got it added pretty quickly. I can't remember if it is still in beta or not.
-
Hi Rob, I double-checked to make sure the formulas match. Today, just to try something and see what happens, within the ETL formula I removed the single quotes around '24' and ran it. Now that formula is working correctly all the way through. So that's good. This whole thing stems from a "gaps and islands" situation. It…
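In other words, the change was from a text comparison to a numeric one, roughly like this (placeholder column name):

```sql
-- Before: compares against the string '24', so the match depends on text comparison
`gap_hours` = '24'

-- After: compares against the number 24, which is what the formula needed
`gap_hours` = 24
```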
-
I've also been getting errors when selecting several beastmodes at once to archive them. I thought maybe some of them were locked, so I filtered the beastmodes to "not used in a card" and to myself as the owner. I have admin security as well, but it would still say failed to archive. It could just be a…
-
This worked for me
-
No, but we did need to put in a request to our Domo support person to get it to show up for us. It may be permissions configured within her role on the Dayforce side, but the person whose credentials we tried is a Dayforce Admin, so I'm not sure. She has reached out to them.
-
I just wanted to follow up on this in case anyone else has a related issue. This was not a bug. The problem was that I was prefacing the columns with "zip2fip." I followed the advice below and it resolved my issue. "I will provide some clarification. The US map is unique in that it really could be one of several different…
-
I went ahead and tried all the zip2fips columns in both the United States map card and the US State map card and still had no luck. Just to test, I went into the ETL, removed the column name change, and reverted them to their original names. Then they started working properly. So that is odd, and I may need to submit a help desk…
-
Good feedback. The reason I had tried to avoid the alert method is that it is natural for pieces not to be updated in the database for 3-5 hours at certain times of the day. But I suppose I can set the alert to go off when a piece hasn't been updated in 12 hours. Ideally I'd like to catch the failure ASAP, but this will at least allow me…
-
Thanks, I was definitely overthinking this. For some reason I thought that way wouldn't work when I started this, but it did. Chalk it up to an airhead moment.
-
After I reloaded it, it changed to say nothing would be affected. So that's good.
-
FYI, for "How to Use MetaData to Clean and Instance", I copied the code for my ETL. I'm not sure if maybe I am misunderstanding, or if maybe this is an accident, but the very bottom right filter (Find DS to remove) was set to <= 180 days and not >=180 days. Also, this was very helpful, I had tried building my own versions…
-
Nevermind, I just figured out how to copy and paste the code, sorry about that
-
Hi there, in the dataflow presented by John Stevens, I just had a question about the joins following the Group Bys in the dataflow details and Datasets portion of the ETL (circled in my screenshot). From "dataflow details", we group by dataset output ID and count their occurrences. When we join that group by back to…
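To spell out the step I'm asking about, in SQL it would be roughly this (column names are approximations of the DomoStats fields, not necessarily the exact ones in the dataflow):

```sql
-- Count how many rows in "dataflow details" reference each output dataset,
-- then join that count back onto the detail rows.
SELECT d.*,
       c.output_count
FROM dataflow_details AS d
JOIN (
    SELECT output_dataset_id,
           COUNT(*) AS output_count
    FROM dataflow_details
    GROUP BY output_dataset_id
) AS c
  ON d.output_dataset_id = c.output_dataset_id;
```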
-
Hi there, I lost my notes. I'm trying to find the link to the DomoStats/Governance dataflow that was presented when discussing cleaning up your company's Domo instance. I believe he said he'd posted information in the Dojo about the ETL he had created.
-
Yes, one thing I have noticed is that the time it takes to run continues to increase. It used to be less than an hour, and now it takes over 13 hours, and it's not even a year old. One thing I was thinking, instead of the way I currently have it (which sounds like an attempt at a recursive dataflow), is if, like Grant…
-
Okay, I ended up finding a way to make it work. It might be similar to what jaeW was explaining to me in his first comment, but I just wasn't quite grasping the whole thing. In ETL, I took a branch off and grouped by Date, Staff, and Call Hours. Then another branch off of Date, Staff, and Description. Then I joined these two…
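Expressed as SQL (it's Magic ETL tiles in practice; the table and column names here are approximations, and joining on Date + Staff is my read of the part that is cut off above):

```sql
-- Branch 1: distinct Date/Staff/Call Hours rows
-- Branch 2: distinct Date/Staff/Description rows
-- Join the two branches back together on Date and Staff
SELECT h.`Date`, h.`Staff`, h.`Call Hours`, d.`Description`
FROM (
    SELECT `Date`, `Staff`, `Call Hours`
    FROM calls
    GROUP BY `Date`, `Staff`, `Call Hours`
) AS h
JOIN (
    SELECT `Date`, `Staff`, `Description`
    FROM calls
    GROUP BY `Date`, `Staff`, `Description`
) AS d
  ON h.`Date` = d.`Date` AND h.`Staff` = d.`Staff`;
```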
-
Okay, that makes sense. It works when I don't use the series because I'm not excluding the denominator from the single whole, but once I put them into buckets, the denominator becomes separated from the numerator, or something like that, I think?
-
Good morning. So now I'm having an issue with one of my beastmodes that I cannot quite figure out, and looking at the data table (SS2), it seems like it should be working. The issue comes into play when I add a series. The concept of the beastmode is taking the distinct count of HC_donor_call_outcomes.id and dividing it by the…
-
Thanks everyone for the input. In the end I decided to go with the append approach Mark suggested because, like jaeW says, I like the visibility, especially for future troubleshooting. I ended up un-aggregating all of the other tables as well, with the exception of login hours per day, and used "staff" and "date" as a common…
-
Thanks guys, I'll try it out
-
Thanks, this was helpful. I did try_cast [column] as float between 9.5 and 11.99 and it is working now. First I tried to do it as numeric, but for some reason it would include 12.0-12.2 in the return.
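For reference, the filter that worked looks roughly like this (placeholder column name; the exact casting syntax may vary depending on where the formula is used):

```sql
-- Cast the text column to a float first, then apply the range filter
TRY_CAST(`hours_column` AS FLOAT) BETWEEN 9.5 AND 11.99
```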