WB4: "Input String not in correct format"?
Does anybody have any ideas about this error?
I'm running a WB4 job connecting to an xlsx file. I've manually run this job 14 times successfully. I've tried to preview my data three or four times this morning, and am always met with the error "Input string was not in a correct format." Looking at the log for the job, I see the following action (pasted below). It looks like it's reading my file properly, but is something wrong in the WB4 code?
Any ideas why I can't run this data in the exact same way I've done over a dozen times already (over the course of about 6 weeks)?
Thanks in advance! ~Gus
[ERROR] [9872:4] 2016-01-18 09:54:59,023 - Error executing DataSet Job
System.FormatException: Input string was not in a correct format.
at System.Number.ParseDouble(String value, NumberStyles options, NumberFormatInfo numfmt)
at System.String.System.IConvertible.ToDouble(IFormatProvider provider)
at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
at CalcEngine.Expression.op_Implicit(Expression x) in c:\jenkins\workspace\Workbench\WorkbenchBuild\CalcEngine\Expression.cs:line 105
at CalcEngine.BinaryExpression.Evaluate() in c:\jenkins\workspace\Workbench\WorkbenchBuild\CalcEngine\Expression.cs:line 265
at CalcEngine.Expression.op_Implicit(Expression x) in c:\jenkins\workspace\Workbench\WorkbenchBuild\CalcEngine\Expression.cs:line 78
at CalcEngine.BinaryExpression.Evaluate() in c:\jenkins\workspace\Workbench\WorkbenchBuild\CalcEngine\Expression.cs:line 267
at DomoDataTransforms.CalculatedField.CalculatedFieldTransform.<>c__DisplayClass2.<ExecuteDataTransform>b__1() in c:\jenkins\workspace\Workbench\WorkbenchBuild\DataTransforms\DomoDataTransforms\CalculatedField\CalculatedFieldTransform.cs:line 57
at System.Threading.Tasks.Task.Execute()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at WorkbenchSDK.Execution.WorkbenchJobExecutor.<ProcessRows>d__61.MoveNext() in c:\jenkins\workspace\Workbench\WorkbenchBuild\WorkbenchSDK\Execution\WorkbenchJobExecutor.cs:line 507
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at WorkbenchSDK.Execution.WorkbenchJobExecutor.<PreviewJobOutputAsync>d__24.MoveNext() in c:\jenkins\workspace\Workbench\WorkbenchBuild\WorkbenchSDK\Execution\WorkbenchJobExecutor.cs:line 208
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at WorkbenchSDK.Execution.WorkbenchJobExecutor.<PreviewJobOutputAsync>d__6.MoveNext() in c:\jenkins\workspace\Workbench\WorkbenchBuild\WorkbenchSDK\Execution\WorkbenchJobExecutor.cs:line 50
Best Answer
Hmmm... For some reason my previous response never posted here in Dojo.
In short, @kshah008, not really... though it did confirm what column I believed was causing the error.
I had a transform in the WB4 job that was calculating the difference between two columns (datetime, in HH:MM format) and returning a decimal value (hours). It's worked perfectly for weeks, but stopped working sometime over the past couple weeks.
I have, however, found a solution. I abandoned the job and started a new WB4 job that eliminates the transform. I then perform the transform once the data gets into the Data Center, though I use a different method: I'm now turning the datetimes into UNIX_TIMESTAMPs, subtracting, and dividing by 3600 to convert the value (seconds) into hours.
Because the starting values were in HH:MM format, there was one additional step. I had to perform a DATE_ADD(value, INTERVAL 75 YEAR) on both values to make sure they were after the Unix epoch (timestamp 0), but since all I'm after is the difference between the two, the shift isn't consequential.
This works fine, but I had to do some additional dataflow work to splice this data into the weeks' worth of good data I had previously. There's probably a more elegant solution, but this is the one I pieced together...
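For reference, the dataflow approach above can be sketched outside SQL. This is a minimal Python illustration (not Domo code; column values are hypothetical) of the same idea: anchor both HH:MM values to an arbitrary date after the Unix epoch, subtract the timestamps to get seconds, and divide by 3600 for decimal hours.

```python
from datetime import datetime

def hours_between(start_hhmm, end_hhmm):
    # Anchor both HH:MM values to the same arbitrary date after the
    # Unix epoch -- the equivalent of the DATE_ADD 75-year shift.
    # Only the difference matters, so the anchor date cancels out.
    fmt = "%Y-%m-%d %H:%M"
    start = datetime.strptime("1975-01-01 " + start_hhmm, fmt)
    end = datetime.strptime("1975-01-01 " + end_hhmm, fmt)
    # A difference of Unix timestamps is in seconds; divide by 3600
    # to convert to decimal hours.
    return (end.timestamp() - start.timestamp()) / 3600
```

For example, hours_between("09:00", "17:30") returns 8.5 decimal hours, the same shape of output the original calculated-field transform produced.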
Answers
@gush, this error happens when Workbench tries to read an Excel column that is supposed to be of the "DOUBLE" type (i.e. a decimal number) but contains a value that doesn't conform to that format. Sometimes there are stray characters in a cell, or even an Excel error code of some sort that throws off the column parsing (e.g. any kind of text value like "#N/A" in a cell will likely cause the job to fail).
Your best bet is to open the file manually and try to identify any problematic values in columns that should be numeric decimals. Also feel free to share the file with me through a private message and I'm happy to take a look to see what's going on.
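As a sketch of that failure mode (illustrative Python, not Workbench's internals; the row/column shape and sample values are hypothetical), a strict numeric parse fails on the first bad cell, while a tolerant pre-scan can report exactly which cells would break a DOUBLE column:

```python
def find_bad_numeric_cells(rows, column):
    """Return (row_index, value) pairs that would break a strict
    numeric parse -- e.g. Excel error codes like '#N/A' or stray
    text in a column expected to be a DOUBLE."""
    bad = []
    for i, row in enumerate(rows):
        value = row[column]
        try:
            float(value)  # the same check a strict DOUBLE parse performs
        except (TypeError, ValueError):
            bad.append((i, value))
    return bad

# Hypothetical sample data: one cell holds an Excel error code.
rows = [{"hours": "7.5"}, {"hours": "#N/A"}, {"hours": "8"}]
```

For the sample rows, only the "#N/A" cell is flagged, so you know which row to clean up in the spreadsheet before rerunning the job.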
Glad you worked it out! I'm going to mark your response as the solution.