PostgreSQL JDBC Connection Error

I’m getting an error at the end of one of my pipelines. It says an insert statement failed to run, but if I run the same insert manually it works fine. I have a feeling it’s due to a timeout limit somewhere. Has anyone seen this before?

> SQL operation failed
> 
> Please check for valid Snap properties and input data.
> Reason:
> An I/O error occurred while sending to the backend., error code: 0, SQL state: 08006 connection failure
> 
> new_coverage_insert[5e203b525de8f07c592265c6_ad5b142c-48bc-4018-a036-ed2291ae8bc1 -- 6bb90d78-bb6e-4c00-b1d9-d932706b37ec]
> `com.snaplogic.cc.snap.common.ThreadDetails: Id=237567 TIMED_WAITING on com.snaplogic.common.url.protocol.plexfs.PlexFsUrlConnection$InterruptablePipedInputStream@438fc771
> 	at java.lang.Object.wait(Native Method)
> 	-  waiting on com.snaplogic.common.url.protocol.plexfs.PlexFsUrlConnection$InterruptablePipedInputStream@438fc771
> 	at java.io.PipedInputStream.awaitSpace(PipedInputStream.java:273)
> 	at java.io.PipedInputStream.receive(PipedInputStream.java:231)
> 	at java.io.PipedOutputStream.write(PipedOutputStream.java:149)
> 	at com.snaplogic.common.url.protocol.plexfs.PlexFsUrlConnection$PlexFsPipedOutputStream.write(PlexFsUrlConnection.java:567)
> 	at org.apache.commons.io.output.ProxyOutputStream.write(ProxyOutputStream.java:90)
> 	at com.fasterxml.jackson.core.json.UTF8JsonGenerator._flushBuffer(UTF8JsonGenerator.java:2003)
> 	at com.fasterxml.jackson.core.json.UTF8JsonGenerator.close(UTF8JsonGenerator.java:1049)
> 	...
> 	at java.lang.Object.wait(Native Method)
> 	at java.io.PipedInputStream.awaitSpace(PipedInputStream.java:273)
> 	at java.io.PipedInputStream.receive(PipedInputStream.java:231)
> 	at java.io.PipedOutputStream.write(PipedOutputStream.java:149)
> 	at com.snaplogic.common.url.protocol.plexfs.PlexFsUrlConnection$PlexFsPipedOutputStream.write(PlexFsUrlConnection.java:567)
> 	at org.apache.commons.io.output.ProxyOutputStream.write(ProxyOutputStream.java:90)
> 	at com.fasterxml.jackson.core.json.UTF8JsonGenerator._flushBuffer(UTF8JsonGenerator.java:2003)
> 	at com.fasterxml.jackson.core.json.UTF8JsonGenerator.close(UTF8JsonGenerator.java:1049)
> 	at com.fasterxml.jackson.databind.ObjectMapper._configAndWriteValue(ObjectMapper.java:3561)
> 	at com.fasterxml.jackson.databind.ObjectMapper.writeValue(ObjectMapper.java:2893)
> 	at com.snaplogic.document.parser.DocumentStreamWriter.write(DocumentStreamWriter.java:52)
> 	at com.snaplogic.cc.snap.view.document.DocumentPreviewGenerator.openNewFileAndWrite(DocumentPreviewGenerator.java:121)
> 	at com.snaplogic.cc.snap.view.document.DocumentPreviewGenerator.writeToStorage(DocumentPreviewGenerator.java:99)
> 	at com.snaplogic.cc.snap.view.document.DocumentPreviewGenerator.writePreview(DocumentPreviewGenerator.java:60)
> 	at com.snaplogic.cc.snap.view.document.DocumentOutputViewImpl.lambda$write$0(DocumentOutputViewImpl.java:176)
> 	at com.snaplogic.cc.snap.view.document.DocumentOutputViewImpl$Lambda$291/346687012.call(Unknown Source)
> 	at com.snaplogic.cc.snap.common.ViewImpl.timeCall(ViewImpl.java:658)
> 	at com.snaplogic.cc.snap.view.document.DocumentOutputViewImpl.write(DocumentOutputViewImpl.java:162)
> 	at com.snaplogic.snap.api.OutBoundViewsImpl.write(OutBoundViewsImpl.java:221)
> 	at com.snaplogic.snap.api.ErrorViewsImpl.write(ErrorViewsImpl.java:193)
> 	at com.snaplogic.snap.api.sql.DatabaseUtils.writeErrorView(DatabaseUtils.java:716)
> 	at com.snaplogic.snap.api.sql.DatabaseUtils.handleUpdateResult(DatabaseUtils.java:625)
> 	at com.snaplogic.snap.api.sql.DatabaseUtils.handleUpdateResult(DatabaseUtils.java:520)
> 	at com.snaplogic.snaps.sql.SimpleSqlWriteSnap.handle(SimpleSqlWriteSnap.java:147)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.callback(SnapRunnableImpl.java:805)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.execute(SnapRunnableImpl.java:551)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.doRun(SnapRunnableImpl.java:834)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:400)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:116)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> 
> Error Fingerprint[0] = efp:java.lang.48BpFI_N`
> 
> new_coverage_insert[5e203b525de8f07c592265c6_ad5b142c-48bc-4018-a036-ed2291ae8bc1 -- 6bb90d78-bb6e-4c00-b1d9-d932706b37ec]
> `com.snaplogic.snap.api.SnapDataException: SQL operation failed
> 	at com.snaplogic.snap.api.sql.DatabaseUtils.writeErrorView(DatabaseUtils.java:697)
> 	at com.snaplogic.snap.api.sql.DatabaseUtils.handleUpdateResult(DatabaseUtils.java:625)
> 	at com.snaplogic.snap.api.sql.DatabaseUtils.handleUpdateResult(DatabaseUtils.java:520)
> 	at com.snaplogic.snaps.sql.SimpleSqlWriteSnap.handle(SimpleSqlWriteSnap.java:147)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.callback(SnapRunnableImpl.java:805)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.execute(SnapRunnableImpl.java:551)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.doRun(SnapRunnableImpl.java:834)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:400)
> 	at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:116)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.sql.BatchUpdateException: Batch entry 0 insert into "query" was aborted.  Call getNextException to see the cause.
> 	at org.postgresql.jdbc.PgStatement$BatchResultHandler.handleError(PgStatement.java:2356)
> 	at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:345)
> 	at org.postgresql.jdbc.PgStatement.executeBatch(PgStatement.java:2534)
> 	at com.zaxxer.hikari.pool.ProxyStatement.executeBatch(ProxyStatement.java:128)
> 	at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.executeBatch(HikariProxyPreparedStatement.java)
> 	at org.jooq.tools.jdbc.DefaultStatement.executeBatch(DefaultStatement.java:93)
> 	at org.jooq.impl.BatchSingle.executePrepared(BatchSingle.java:239)
> 	at org.jooq.impl.BatchSingle.execute(BatchSingle.java:182)
> 	at com.snaplogic.snap.api.sql.operations.JdbcOperationsImpl.lambda$flushBatch$8(JdbcOperationsImpl.java:1602)
> 	at net.jodah.failsafe.Functions$11.call(Functions.java:263)
> 	at net.jodah.failsafe.SyncFailsafe.call(SyncFailsafe.java:145)
> 	at net.jodah.failsafe.SyncFailsafe.get(SyncFailsafe.java:69)
> 	at com.snaplogic.snap.api.sql.operations.JdbcOperationsImpl.flushBatch(JdbcOperationsImpl.java:1599)
> 	at com.snaplogic.snaps.sql.SimpleSqlWriteSnap.handle(SimpleSqlWriteSnap.java:136)
> 	... 11 more
> Reason: An I/O error occurred while sending to the backend., error code: 0, SQL state: 08006 connection
>     failure
> Resolution: Please check for valid Snap properties and input data.

I’ll reply to my own post. This error was due to a timeout between our SnapLogic Cloudplex and our AWS PostgreSQL instance. Adding Sort Snaps in front of all my insert and update SQL Snaps resolved the issue for me. I don’t fully understand the reasoning, but it works.
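A guess at the mechanism: a Sort Snap buffers its entire input before emitting anything, so the downstream SQL Snap receives rows in one continuous burst instead of holding an idle connection open while upstream Snaps trickle data in, and idle connections between a Cloudplex and AWS are exactly what firewalls and NAT devices tend to drop. An alternative worth trying is enabling TCP keepalives on the JDBC URL; `tcpKeepAlive`, `socketTimeout` (seconds), and `connectTimeout` (seconds) are real pgjdbc driver parameters, while the host and database names below are placeholders:

```
jdbc:postgresql://my-host.rds.amazonaws.com:5432/mydb?tcpKeepAlive=true&socketTimeout=300&connectTimeout=30
```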

Hi Davis,

I didn’t hit that issue, but I was limited to sending only 50 statements per script in the execute Snap… something about an index out of bounds…

The SQL script I’m generating is an upsert using ON CONFLICT DO UPDATE, but the table has so, so many columns. Were you able to do a bulk load using CSV so you don’t need to declare the columns? Or do you have a template that sets the columns at runtime? Also, what is the format of the schema table? Is it like declaring a variable in SQL? I’m kind of a newbie with Postgres.
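On the “set the columns during runtime” idea, one way to avoid hand-typing a wide table is to generate the upsert statement from a column list. This is just a sketch outside SnapLogic (the table and column names are made up), showing the shape of an `INSERT … ON CONFLICT DO UPDATE` built at runtime:

```python
# Build a PostgreSQL upsert (INSERT ... ON CONFLICT DO UPDATE) from a
# runtime column list, so a very wide table doesn't have to be typed out.
# Table/column names here are hypothetical examples.

def build_upsert(table, columns, conflict_cols):
    """Return an upsert statement with one %s placeholder per column."""
    col_list = ", ".join(columns)
    placeholders = ", ".join(["%s"] * len(columns))
    # Every non-key column is updated from the proposed row (EXCLUDED).
    updates = ", ".join(
        f"{c} = EXCLUDED.{c}" for c in columns if c not in conflict_cols
    )
    return (
        f"INSERT INTO {table} ({col_list}) VALUES ({placeholders}) "
        f"ON CONFLICT ({', '.join(conflict_cols)}) DO UPDATE SET {updates}"
    )

sql = build_upsert("coverage", ["id", "state", "premium"], ["id"])
print(sql)
# INSERT INTO coverage (id, state, premium) VALUES (%s, %s, %s)
#   ON CONFLICT (id) DO UPDATE SET state = EXCLUDED.state, premium = EXCLUDED.premium
```

The generated string can then be handed to whatever executes the SQL, with the row values bound to the placeholders. (Note this only works safely if the column names come from a trusted source such as the table’s own schema, since they are interpolated into the statement text.)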


I have this topic open. Any suggestions or insights?

Thanks,
EmEm

I haven’t used the bulk load feature at all. When I stage my data, I just use a Mapper Snap and then an insert. I suppose you could dynamically set all the column names, but it isn’t something I’ve actually done in SnapLogic.