Comments
-
@MichelleH yup, on the same machine, and it took a little under 3 minutes just now
-
Hey @MichelleH we are using the SQLOLEDB (64-bit) provider for this. When I run the exact same query in SSMS, it takes about 2 mins to return the 1.4mil rows.
-
@david_cunningham Thanks for the response. I am aware of the whole 3rd day thing, but my question is more about whether the charge would be on the 1mil or the 2mil. Let's assume that this day is the 3rd lowest usage date, and this dataset went from 1mil → 2mil → 1mil rows. Would I be charged 1 credit? 2 credits? 4 credits?…
-
@AdamC To have the "rename" syntax appear, you need to rename at least one of them before copying.
-
Alright, well, they replied super quick. Apparently you need to clear your cache, and once refreshed it should all come back. I just tried it and it worked.
-
I went ahead and submitted a support ticket, I will reply here once I get an answer. Thank you for the help @jessdoe
-
Thank you for the reply. I immediately thought it was that, as I have seen it happen to others, but I scrolled all the way to the bottom and even zoomed out as much as I could to make sure, and that message does not appear on the screen.
-
Sorry to revive this thread, but I just wanted to ask if there is a way to set a GLOBAL on-failure notification, rather than having to check the box for the specific email on each individual dataset.
-
@Lewis We have identified that the major issue causing this is having the timezone conversion activated on the Workbench job. We removed it, ran a replace of the whole dataset, then re-added it, and it seems to have fixed itself... for now. We have been running into quite a few issues regarding…
-
Alright, I'm back with an update on my previous comment. Their solution did not work for me... but I have fixed my issue. I had a group schedule with 18 jobs in it that all run in parallel. I figured maybe the load was too much for a single group schedule, so I split it into 2 group schedules that…
-
We reached out to Domo support and they responded with: "Hi, we have a fix for the scheduling issue in Workbench. It can be fixed by clearing the schedule (selecting manual run), saving, then setting a schedule again. There was an update to our CRON library for Daylight Saving Time. Please follow the instructions above to…"
-
Adding a comment here to tag along with this issue, as I am also getting the same issue. It started happening maybe a week ago; before that it was running just fine.
-
Hey, quick question on this. I get the CLI command to run this, but I was wondering about running a group schedule via this method. Does a group schedule itself have a job id I can reference for this, or would I have to call this command on each of the jobs within the schedule?
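For the fallback case I had in mind, a loop over the member jobs would look roughly like this. A minimal sketch, assuming per-job invocation only: `wb.exe`, the `run-job --id` flags, and the job ids are all assumptions, so check the CLI help on your Workbench version before using this.

```shell
#!/bin/sh
# Fallback sketch: no group-schedule id, so call the per-job run command
# once for each member job of the schedule.
# WB and the run-job/--id flags are assumptions about the Workbench CLI;
# set WB=echo to dry-run the loop without Workbench installed.
WB="${WB:-wb.exe}"

run_group() {
    for job_id in "$@"; do
        # Run one job; report (but don't abort on) a failed job.
        "$WB" run-job --id "$job_id" || echo "job $job_id failed" >&2
    done
}

run_group 101 102 103    # hypothetical job ids in the group schedule
```

If the jobs are safe to run in parallel (like a real group schedule), each call could be backgrounded with `&` and collected with `wait`.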
-
Ohhh snap, ok, I will investigate to see if they always have a set number. Thanks for this! I had no idea we had that limit-rows ability.
-
Ahhh gotcha, I was seeing if there was a way around the additional layer of ETL, but I guess I will stick with this. Thank you!
-
@jaeW_at_Onyx Thanks for the help. I was able to get my admin to enable direct sign-on for me and that solved the issue and everything is working fine. Love the channel and videos by the way, SUPER HELPFUL!!
-
@Fatias @jaeW_at_Onyx I am trying to implement this as well, and we also have SSO enabled. How would I get the password needed for the request body?
-
It is the querying. The portion before the process monitor says: "[07.14.22 2:57:53 PM] Finished reading XXX,XXX data rows." and so on. There is no join as far as I know, and when I removed the column I simply excluded it from the query itself.
-
I did read up on that, but what if Workbench is installed on one server and the SQL jobs run on a different server? Is that possible?
-
Wow... just tested this on a small scale (with 700+ columns, the browser keeps freezing on me anytime I open up that tile lol) and it worked like a charm, thank you... I will comment on here again if I end up needing more help lol @MarkSnodgrass
-
For the first one, wouldn't that still require me to bring all the data into DOMO and then apply the recursion in the ETL? At that point I might as well just do a replace of all the data at once. For the second one, the thing is I am not keeping history in these sources. Therefore DOMO will have more data than what I…
-
No, I don't even see the Upsert column in the job configuration's Schema tab to be able to select my key. The documentation states that if it is not there, you should contact CS, but my concern is whether I will need to do this every single time I create a new dataset.
-
Even if both workbenches are logged in with the same account?
-
Well, let me ask this then. If I create a job (let's say job id 123) on a server where Workbench is installed and everything is good, and then I log into my local machine (i.e. my laptop or desktop at work) and install Workbench on that machine, would I be able to see job id 123 in this Workbench? (Along with the other jobs that…