07-10-2018 02:50 PM
We’re using the Oracle Insert Snap with an Oracle Thin Account. The account’s Fetch Size and Batch Size properties are both set to 5000. We’ve been successfully using the account with this configuration to load millions of rows into different target tables. Recently we’ve started sporadically getting the following error. Here’s the error view (minus the original document). If we attempt to load the failed document again using the same pipeline, it inserts successfully, so there doesn’t appear to be an issue with the data itself. Has anybody else encountered this, or does anyone know why this error would be produced?
{
  "error": "SQL operation failed",
  "stacktrace": "java.sql.BatchUpdateException: ORA-01438: value larger than specified precision allowed for this column\n\n\tat oracle.jdbc.driver.OraclePreparedStatement.executeBatch(OraclePreparedStatement.java:10500)\n\tat oracle.jdbc.driver.OracleStatementWrapper.executeBatch(OracleStatementWrapper.java:230)\n\tat com.zaxxer.hikari.pool.ProxyStatement.executeBatch(ProxyStatement.java:128)\n\tat com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeBatch(HikariProxyPreparedStatement.java)\n\tat org.jooq.tools.jdbc.DefaultStatement.executeBatch(DefaultStatement.java:93)\n\tat org.jooq.impl.BatchSingle.executePrepared(BatchSingle.java:239)\n\tat org.jooq.impl.BatchSingle.execute(BatchSingle.java:182)\n\tat com.snaplogic.snap.api.sql.operations.JdbcOperationsImpl.flushBatch(JdbcOperationsImpl.java:1285)\n\tat com.snaplogic.snap.api.sql.operations.JdbcOperationsImpl.insert(JdbcOperationsImpl.java:724)\n\tat com.snaplogic.snaps.sql.SimpleSqlInsertSnap.processData(SimpleSqlInsertSnap.java:131)\n\tat com.snaplogic.snaps.sql.SimpleSqlWriteSnap.processDocument(SimpleSqlWriteSnap.java:100)\n\tat com.snaplogic.snaps.sql.SimpleSqlSnap.process(SimpleSqlSnap.java:256)\n\tat com.snaplogic.snap.api.ExecutionUtil.process(ExecutionUtil.java:93)\n\tat com.snaplogic.snap.api.ExecutionUtil.execute(ExecutionUtil.java:105)\n\tat com.snaplogic.snap.api.ExecutionUtil.execute(ExecutionUtil.java:73)\n\tat com.snaplogic.snap.api.SimpleSnap.execute(SimpleSnap.java:67)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl.executeSnap(SnapRunnableImpl.java:718)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl.execute(SnapRunnableImpl.java:493)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl.doRun(SnapRunnableImpl.java:779)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl.access$000(SnapRunnableImpl.java:109)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl$1.run(SnapRunnableImpl.java:336)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl$1.run(SnapRunnableImpl.java:332)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:331)\n\tat com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:109)\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:266)\n\tat java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)\n\tat java.util.concurrent.FutureTask.run(FutureTask.java:266)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\n",
  "reason": "ORA-01438: value larger than specified precision allowed for this column\n, error code: 1438, SQL state: 22003",
  "resolution": "Please check for valid Snap properties and input data.",
  "status_code": "error",
  "status": -2,
  "snap_details": {
    "label": "ABC",
    "instance_id": "",
    "class_id": "com-snaplogic-snaps-oracle-insert",
    "build_tag": "snapsmrc486",
    "views": {
      "in": {
        "inputRows": {
          "count": 5500
        }
      },
      "out": {
        "output101": {
          "count": 5000
        }
      },
      "error": {
        "error0": {
          "count": 0
        }
      }
    }
  },
  "original": {
  }
}
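For context, ORA-01438 means a numeric value has more digits to the left of the decimal point than the column’s declared precision allows, e.g. 1000 into a NUMBER(5,2), which tops out at 999.99. Here’s a minimal JDBC sketch that reproduces the same BatchUpdateException path shown in the stack trace above; the connection details and scratch table are hypothetical, and it assumes the Oracle JDBC driver (ojdbc) is on the classpath:

```java
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;

// Minimal sketch reproducing ORA-01438 inside a JDBC batch.
// Connection URL, credentials, and the scratch table are placeholders.
public class Ora01438Demo {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password")) {
            try (Statement st = conn.createStatement()) {
                // NUMBER(5,2) holds at most 3 digits left of the decimal point.
                st.execute("CREATE TABLE t_demo (amount NUMBER(5,2))");
            }
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO t_demo (amount) VALUES (?)")) {
                ps.setBigDecimal(1, new BigDecimal("999.99"));  // fits
                ps.addBatch();
                ps.setBigDecimal(1, new BigDecimal("1000.00")); // too many integer digits
                ps.addBatch();
                // Throws java.sql.BatchUpdateException: ORA-01438, and neither
                // row is inserted -- the whole batch fails together.
                ps.executeBatch();
            } finally {
                try (Statement st = conn.createStatement()) {
                    st.execute("DROP TABLE t_demo");
                }
            }
        }
    }
}
```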
07-10-2018 10:32 PM
As the error says, a value is larger than the precision declared for that column. Either widen the column definition on the Oracle table, or trim/round the value in the pipeline before the Snap so it fits the declared limit.
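If widening the column isn’t an option, the value has to be rounded or clamped to the declared precision before it reaches the Snap. The arithmetic looks roughly like this Java sketch; the class name and the clamping policy are illustrative, not anything SnapLogic provides:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

// Hypothetical helper: make a value fit a NUMBER(precision, scale) column
// instead of raising ORA-01438. Assumes precision > scale.
public final class NumberFit {
    public static BigDecimal fit(BigDecimal v, int precision, int scale) {
        // First round to the column's scale.
        BigDecimal rounded = v.setScale(scale, RoundingMode.HALF_UP);
        // Largest magnitude NUMBER(p,s) can hold, e.g. 999.99 for (5,2).
        BigDecimal max = BigDecimal.TEN.pow(precision - scale)
                .subtract(BigDecimal.ONE.movePointLeft(scale));
        // Clamp anything that still exceeds the declared precision.
        if (rounded.abs().compareTo(max) > 0) {
            rounded = rounded.signum() < 0 ? max.negate() : max;
        }
        return rounded;
    }
}
```

For example, `NumberFit.fit(new BigDecimal("1234.567"), 5, 2)` returns 999.99, which a NUMBER(5,2) column accepts. Whether clamping (rather than rejecting) is acceptable depends on the data, of course.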
07-12-2018 04:03 PM
If one document in the batch causes the SQL operation (insert) to fail, then none of the documents in the batch are inserted. This makes sense. A document is written to the error view for every document in the failed batch. This also makes sense. What we’ve seen is that the first error encountered in the batch operation is associated with every document written to the error view for that batch, regardless of whether the error applies to that specific document. Does anyone know if this behavior comes from the JDBC driver or from SnapLogic? I understand the need for an error that indicates why the document failed in the batch operation, but it would be helpful to associate the database error only with the document that caused the failure, and to attach a generic “batch SQL operation failed” error to the other documents. This would make it easier to identify the document that produced the error.
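For what it’s worth, this appears to originate in the driver rather than in SnapLogic: Oracle’s JDBC documentation says that for a prepared statement batch it is not possible to know which operation failed, and getUpdateCounts() returns Statement.EXECUTE_FAILED (-3) for every element. So the Snap only ever has one exception, with no per-row attribution to work with. Here’s a small sketch of what a caller can actually recover from the driver; the class and method names are mine, for illustration:

```java
import java.sql.BatchUpdateException;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;

// Sketch: inspecting what the driver reports after a failed batch.
// Semantics vary by driver; with Oracle's prepared-statement batching,
// every element typically comes back as Statement.EXECUTE_FAILED (-3),
// so the driver itself does not say which row caused the failure.
public final class BatchDiagnostics {
    public static int[] executeAndReport(PreparedStatement ps) throws SQLException {
        try {
            return ps.executeBatch();
        } catch (BatchUpdateException e) {
            int[] counts = e.getUpdateCounts();
            int failed = 0;
            for (int count : counts) {
                if (count == Statement.EXECUTE_FAILED) {
                    failed++;
                }
            }
            System.err.println(failed + " of " + counts.length
                    + " operations marked failed; cause: " + e.getMessage());
            throw e;
        }
    }
}
```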
10-29-2018 10:57 AM
Hello PSAmmirata,
I’m facing the same issue right now. My database account’s Batch Size is set to 50, and if a single record has an error, the entire batch errors out with the error message from the first record, even though the other 49 records are valid and contain no erroneous data (tried and tested by manually removing the one erroneous record).
Setting the Batch Size to 1 resolved the issue, but now the pipeline takes about 20 times longer than before ☹️
Kindly suggest if you came across any other, better solution; that would help me a lot.
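(For anyone facing the same trade-off: if part of the load can run as custom code, e.g. in a Script snap or a small loader outside SnapLogic, a split-and-retry strategy keeps the large batch size on the happy path and only bisects around a failure, so one bad record costs O(log n) extra batch attempts instead of forcing Batch Size 1 for the whole load. A rough sketch, with insertBatch standing in for whatever performs the actual insert:)

```java
import java.util.List;
import java.util.function.Consumer;

// Sketch of a split-and-retry loader. Assumes each insertBatch call is its
// own transaction that rolls back entirely on failure, so retrying the
// halves never duplicates rows.
public final class BisectingLoader {
    public static <R> void load(List<R> rows, Consumer<List<R>> insertBatch) {
        if (rows.isEmpty()) {
            return;
        }
        try {
            insertBatch.accept(rows); // happy path: one attempt per batch
        } catch (RuntimeException batchFailure) {
            if (rows.size() == 1) {
                // Isolated the bad row; route it to error handling.
                System.err.println("bad row: " + rows.get(0));
                return;
            }
            int mid = rows.size() / 2;
            load(rows.subList(0, mid), insertBatch);           // retry first half
            load(rows.subList(mid, rows.size()), insertBatch); // then second half
        }
    }
}
```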
12-06-2019 10:07 AM
Just adding to the chorus here. I’m having the same issue. I’m working to replace sqlldr, so I’m able to compare the same import between the two methods. SnapLogic loads fine if the data is perfect. If there is any issue, in my case foreign key violations, it errors out a lot of good records with the same error code. The error clearly does not apply to the majority of those records (the parent does exist). The rejects do include the records that are appropriately rejected.
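Since the failure mode here is known (missing parents), one mitigation is to pre-screen the rows with an anti-join before the insert, so the batch only sees rows that can succeed. A rough JDBC sketch, assuming the data passes through a staging table; the table and column names (child_stage, parent, parent_id, id) are placeholders:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch: report staged child rows whose parent key does not exist, before
// attempting the batch insert. Names are illustrative placeholders.
public final class OrphanCheck {
    public static void report(Connection conn) throws SQLException {
        String sql =
            "SELECT c.parent_id, COUNT(*) AS orphan_rows " +
            "FROM child_stage c " +
            "WHERE NOT EXISTS (SELECT 1 FROM parent p WHERE p.id = c.parent_id) " +
            "GROUP BY c.parent_id";
        try (PreparedStatement ps = conn.prepareStatement(sql);
             ResultSet rs = ps.executeQuery()) {
            while (rs.next()) {
                System.out.println("missing parent " + rs.getString(1)
                        + " for " + rs.getInt(2) + " rows");
            }
        }
    }
}
```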