-
Transaction Deadlock workbench error
Good morning. I am curious if anyone has had this error before. One of our workbench jobs has been getting it more and more, and I don't know if it is a Domo Workbench issue or an issue on the database side that the workbench job queries from: Error 40001 Microsoft ODBC SQL Server Driver SQL Server Transaction was…
-
Doubt about updating tables and credits consumption
Hi everyone, I've got a question about Domo's Workbench process: if 10 tables are used in order to generate 3 tables through processes done in Workbench, how many credits will be needed?
-
Scheduled job fails to run when DOMO dataset is deleted
I've set up a job in Domo Workbench to pull data from SQL Server and upload it to a Domo instance. Below are the details: Job Details Transport Type: MS SQL ODBC Data Provider Reader Type: Database Query DOMO Details Dataset Name: test_schedule Dataset Type: Workbench ODBC I'm just using a normal select * from <table> as…
-
Large dataset import in Domo Workbench
Hi everyone, I have a very large dataset (18M rows) that I want to import, and it takes too long. The query is already a grouped query, so there is no unique ID. I am just trying to figure out if there is a way to import a one-time chunk (the data spans 2 years) and then maybe append the data after that. So maybe…
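One common workaround (a minimal sketch, not a built-in Workbench feature) is to backfill the 2-year history in date-bounded chunks and then switch the job to an append schedule. Assuming a hypothetical date column to filter on, the monthly windows could be generated like this:

```python
from datetime import date

def month_windows(start: date, end: date):
    """Yield (first_of_month, first_of_next_month) pairs covering start..end."""
    windows = []
    y, m = start.year, start.month
    while date(y, m, 1) <= end:
        ny, nm = (y + 1, 1) if m == 12 else (y, m + 1)
        windows.append((date(y, m, 1), date(ny, nm, 1)))
        y, m = ny, nm
    return windows

# Each window can drive a half-open WHERE clause such as
#   WHERE some_date_col >= ? AND some_date_col < ?   (column name is hypothetical)
chunks = month_windows(date(2022, 1, 1), date(2023, 12, 31))
```

Each chunk is loaded once with APPEND; after the last chunk, the scheduled job only needs to pull the current window.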
-
Workbench Weird Issues
Hello community, has anyone encountered being unable to restart/stop the Domo Workbench service, even via the command line as administrator on a Windows Server 2019 machine? Every time I run the command or try to stop/restart via the GUI in the task bar, I get a message saying I cannot do it. I checked the workbench logs today and saw…
-
Workbench - SharePoint Plugin
Hi there - where do I find the SharePoint plugin for Workbench, as referenced in the documentation below:
-
What is the best password policy for encryption on Workbench?
We have a requirement to encrypt a few fields while uploading via Workbench. What would be a good encryption password policy? Let's say we have ten fields that need to be encrypted, and 30 users who need access and must be able to decrypt them with a password. Do you create 30 different encryption passwords? How do you manage these…
-
Workbench dataset stuck in "preparing" Status
Rarely, we will see a scheduled run from Domo Workbench stuck in a "preparing" status on the dataset. If this does occur, there are a few steps that may help resolve the issue. 1) If the job is still showing a 'Running' status in Workbench, stop the job from the…
-
Job Ends in Cache Warning
Hi everyone, I have a job that always ends with a warning: WARNING - Could not clear cache, this will be cleaned the next time the service starts. Does anyone know how to clear out this warning? This particular dataset is also duplicating records in the target table, and I want to rule out that caching is causing the…
-
Is there a way to run Domo Workbench 5 on a local machine without admin privileges?
Hi, is there a way to run Domo Workbench 5 on a local machine without admin privileges? If there is, please let me know. One more question: even after I enter the admin password, when searching for local files in Workbench from my user account I only see the admin's files, not my Excel sheets and other files, which are present…
-
Domo Workbench executing jobs on a Group Schedule multiple times
Hello everyone, I have multiple Workbench jobs that run on a 4-hour schedule. What I have noticed is that when the jobs run, they run 4 times one after the other, as though they are trying to catch up for not having run in the past 4 hours, so that after running 4 times they look like they caught up for an hourly…
-
Remove column from existing schema
Hello! Is it possible to remove a column from an existing dataset using the Domo CLI? In other words, can I change the schema? My existing schema has 25 columns, but I only have 23 columns in the data file that I want to upload. This causes an issue when uploading the file using the Domo CLI, so the last two columns of my…
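If the upload has to match the dataset's existing schema, one workaround (a sketch, under the assumption that the two missing columns can simply be left empty) is to pad each row of the 23-column file out to the full width before uploading:

```python
import csv
import io

def pad_csv(text: str, target_width: int) -> str:
    """Pad every row of a CSV with empty trailing fields to match target_width."""
    rows = list(csv.reader(io.StringIO(text)))
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in rows:
        writer.writerow(row + [""] * (target_width - len(row)))
    return out.getvalue()

# Toy 3-column input padded to a hypothetical 5-column schema
padded = pad_csv("a,b,c\n1,2,3\n", 5)
```

The padded file then uploads cleanly against the wider schema, with the extra columns arriving as empty values.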
-
DATETIME field with NULL value
I am running into an issue where, when I load up some data via Workbench, there is a table with a DATETIME field whose default value is NULL. When Workbench queries it, it absolutely does not like that there is a NULL value in that field, and it takes about 20+ hours to run the entire process. I removed the column…
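On the SQL side, wrapping the column in COALESCE (or SQL Server's ISNULL) inside the Workbench query avoids sending NULLs at all. The same idea as a Python sketch, using a hypothetical sentinel date that downstream cards can filter out:

```python
from datetime import datetime

SENTINEL = datetime(1900, 1, 1)  # assumption: a sentinel value, not a Domo requirement

def coalesce_datetimes(rows, key):
    """Replace NULL (None) datetimes with a sentinel so the extract has no gaps."""
    return [{**row, key: row[key] or SENTINEL} for row in rows]

rows = [{"id": 1, "updated": None}, {"id": 2, "updated": datetime(2024, 5, 1)}]
fixed = coalesce_datetimes(rows, "updated")
```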
-
How can I fix a workbench job where the Schema is locked and doesn't match the dataset?
Hi, everyone! I'm running into an issue with Workbench and can't seem to find a solution. One of my jobs has its Workbench schema messed up and is pulling nonexistent headers. I'm not sure if it is getting this information from a previous upload, but it doesn't even match the column names or number of columns. I…
-
Workbench - Source File Does Not Exist
Has anyone found a solution for the "Source File Does Not Exist" error? I am using the 'Excel: On-Premise' processing method with a network drive for storage. Even when I reconfigure and reselect the file, it does not resolve and I get the error referenced above. Any ideas? Thanks all!
-
Unable to Schedule Jobs
Hi everyone, I can manually run a job and it works; however, when I schedule the job, it says it executes, but the job does not run and I don't see it in the UI. When I look at the viewer, I see the following error: 'Could not fix consecutive blocks of CRON schedule'. Does anyone know how to resolve this?
-
Workbench installation failed
I am trying to modify/repair an installation of Workbench and am receiving "Domo Workbench failed to install, see log for more details." Any thoughts and/or guidance would be appreciated.
-
How to properly partition in Workbench while allowing older data to come in?
I have a SQL Server dataset of 50M rows. Workbench runs the APPEND Update Method HOURLY. I use the table ID as the UPSERT key and the processing query checks the past 7 days ("Insertion Date") for new records. This updates older data unnecessarily with the same data, but protects against issues with the job not running for…
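For anyone unfamiliar with the pattern being described: an upsert keyed on the table ID means re-sent rows simply overwrite themselves (harmless but wasteful), while genuinely new rows are added. As a toy sketch of that merge semantics:

```python
def upsert(existing: dict, batch: list) -> dict:
    """Merge a batch of rows into existing data keyed by the upsert key 'id'."""
    merged = dict(existing)
    for row in batch:
        merged[row["id"]] = row  # same-key rows overwrite in place; new keys append
    return merged

# Hypothetical rows: id 2 is re-sent unchanged, id 3 is new
current = {1: {"id": 1, "v": "old"}, 2: {"id": 2, "v": "x"}}
result = upsert(current, [{"id": 2, "v": "x"}, {"id": 3, "v": "new"}])
```

Widening the lookback window (the 7 days here) only increases how many rows overwrite themselves, not the final row count, which is why it is a safe buffer against missed runs.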
-
Accessing Dataset
Hello everyone, I am using Domo Workbench and am currently trying to run a dataset job over an ODBC connection. When I execute the job, it runs fine; however, when I go to Domo datasets, I do not see it there. When I execute from Workbench, it executes as another user (another owner). Is there a way I can still see…
-
Create beast mode dates
Hi everyone, help needed! Here is how my table looks below. What I want to show in Analyzer is the count of accounts of these services over time. My problem is how the dates are shown here. Any advice on what I can do? I thought about creating a calculated field to index dates but not sure how to do it. Any help is…
-
Aggregating Numbers
Hello, I have a table that has both positive and negative numbers in it (in my ETL process). What I would like to do is leave it this way and have the calculation take place in the visualization when using a chart. Is this possible? For example: Name | Amount Joe | 10 Mary | -20 This is how the tabular data…
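Leaving signed amounts in the ETL output and letting the chart's SUM aggregation net them out is the usual approach. As a toy sketch (with an extra hypothetical Joe row added to show the netting):

```python
from collections import defaultdict

rows = [("Joe", 10), ("Mary", -20), ("Joe", 5)]  # last row is an added illustration

totals = defaultdict(int)
for name, amount in rows:
    totals[name] += amount  # signed values net out at aggregation time
```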
-
Workbench Extremely Slow (Can't Search / Open Jobs)
Hi everyone, I'm trying to open some jobs to review the queries (and potentially move them to Domo connectors and off WB), but I am stuck in a loading wheel and there seems to be no way out. Is there anything I can do to see the job information without using WB? I believe this is due to our hardware being slow on…
-
How to periodically REPLACE in addition to APPEND jobs?
I have a VERY large dataset that I've uploaded to Domo. I initially did a REPLACE to load the whole dataset. I then changed the workbench job to APPEND, using an 'update date' in the table, to run HOURLY while checking back over the past couple of days for any new/missed data. Due to some bugs on our side, sometimes very old data…
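One approach (an assumption about your setup, not a built-in Workbench option) is to keep the hourly APPEND job and add a second job with the REPLACE update method on a weekly schedule, so stale rows get rebuilt periodically. The cadence amounts to a rule like this:

```python
from datetime import date

def update_method(run_date: date) -> str:
    """Hypothetical policy: full REPLACE on Sundays, incremental APPEND otherwise."""
    return "REPLACE" if run_date.weekday() == 6 else "APPEND"

weekly_rebuild = update_method(date(2024, 1, 7))   # a Sunday
hourly_default = update_method(date(2024, 1, 8))   # a Monday
```

In practice the two Workbench jobs would simply point at the same dataset with different update methods and schedules.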
-
How to append based on new IDs only - not upsert
Is there a way to append only NEW IDs in an upload job? I understand that I can do a "Replace" job for an initial load of a table, and then a regularly scheduled "Append" job using an UPSERT key and a smaller subset of data, such as data with a change date in the past 7 days per the Domo example here:…
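Workbench appends whatever the query returns, so the usual way to get "new IDs only" without upsert is to make the source query exclude IDs already uploaded (for example, WHERE id > the last loaded id). The filtering idea, sketched in Python with hypothetical row shapes:

```python
def new_rows_only(batch: list, known_ids: set) -> list:
    """Keep only rows whose id has not been uploaded before (plain append, no upsert)."""
    return [row for row in batch if row["id"] not in known_ids]

known = {1, 2}                       # ids already in the Domo dataset
to_append = new_rows_only([{"id": 2}, {"id": 3}], known)
```

The same high-water-mark filter in the Workbench query avoids sending duplicate IDs in the first place.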
-
Trigger a workbench job from a sql script
We have a few scripts that run overnight; some run weekly, some daily, and some hourly. Is there a way to have our workbench jobs fire off whenever a script finishes running? Or would we just have to pad it by a few hours; say a script runs every Friday at 11pm, have the workbench job run every Saturday at around 6am…
-
Is there a dataset that shows how long a Workbench Job takes to run?
I would like to create a card that shows which Workbench Jobs are taking the longest to run so I can review and try and improve their overall performance. Ultimately, I want to get a list of Workbench Jobs that are taking longer than 10 minutes to complete. Regards, Jack
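If you can get job start/end timestamps into a dataset (a DomoStats-style job-history feed, assuming your instance exposes one), flagging jobs over 10 minutes is a simple duration filter. A sketch with hypothetical job rows:

```python
from datetime import datetime, timedelta

jobs = [  # hypothetical job-history rows: name, start time, end time
    {"job": "sales_extract", "start": datetime(2024, 1, 1, 2, 0), "end": datetime(2024, 1, 1, 2, 25)},
    {"job": "hr_extract", "start": datetime(2024, 1, 1, 3, 0), "end": datetime(2024, 1, 1, 3, 5)},
]

slow = [j["job"] for j in jobs if j["end"] - j["start"] > timedelta(minutes=10)]
```

The same duration calculation works as a beast mode or ETL step once the history lands in a card.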
-
Workbench Restarts when "Source File Doesn't Exist"
Hello! I'm running into an issue where I have Workbench on a server with ... "issues." Sometimes I get "Source file does not exist" (the file absolutely exists), and then that workbench job doesn't run again until it is manually run. Is there any way to get this job to restart even if it thinks the source file doesn't exist? Am…
-
Trim 2 characters from right side of string using Workbench
I have a dataset that appends every day and I want to remove 2 characters (.0) from the right side using Workbench. I thought this would be simple, but Search & Replace does not allow replacing with null, and replacing with a space or other character is also undesirable. Is there a way to use any of the Workbench…
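If the transform can't replace with an empty string, the trim can instead be pushed into the source query (e.g. SQL Server's LEFT(col, LEN(col) - 2), assuming every value really ends in '.0') or done with an end-anchored regex so untouched values pass through. A Python sketch of the pattern:

```python
import re

def strip_trailing_point_zero(value: str) -> str:
    """Remove a literal '.0' suffix; values without the suffix pass through unchanged."""
    return re.sub(r"\.0$", "", value)

cleaned = [strip_trailing_point_zero(v) for v in ["12345.0", "67.0", "no-suffix"]]
```

Anchoring with `$` matters: it only strips the suffix, never a '.0' that appears mid-string.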
-
Workbench Data Sync Folders won't stay deleted
As I was first connecting files using Domo Workbench, I added a number of folders to Data Sync. I now wish to remove these folders, as there are many files in them that I do not need or want syncing with Domo. I cannot delete the folders themselves because the files are used by many people in my organization. I have gone…
-
Dataset created through API is endlessly 'Storing'
Hello, I have been trying out the Streaming API. I successfully created and loaded data into a dataset using a small number of records. I re-ran the process multiple times, and because the stream is set to 'append', each time the number of records increased by the correct number. However, when I ran the process uploading…