Create Excel files larger than 100MB

rustin
New Contributor

I need to create an Excel file larger than 100 MB with the File Writer snap, but the snap fails with this error:

Possible reasons can be failure in URL connection or file access denial, detail: SLDB does not support data larger than 100 MB

I am copying the whole message below. Is there a way to work around this?
 
 
Failed to write to gdrive.xlsx

Resolution:
Address the reported issue.

Reason:
Possible reasons can be failure in URL connection or file access denial, detail: SLDB does not support data larger than 100 MB
File Writer[570b47e3a415f5440eb1e2dc_43c3cbfb-6c35-4be5-98d4-9dbe3c917864 -- c6bcdbd4-c8aa-4b4f-bb4f-19441389290f]
com.snaplogic.snap.api.SnapDataException: Failed to write to gdrive.xlsx
    at com.snaplogic.snaps.binary.AbstractWriter.throwExceptionCantWrite(AbstractWriter.java:739)
    at com.snaplogic.snaps.binary.AbstractWriter.writeData(AbstractWriter.java:595)
    at com.snaplogic.snaps.binary.AbstractWriter.process(AbstractWriter.java:379)
    at com.snaplogic.snaps.binary.AbstractWriter.doWork(AbstractWriter.java:329)
    at com.snaplogic.snap.api.SimpleBinarySnap.execute(SimpleBinarySnap.java:57)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.executeSnap(SnapRunnableImpl.java:812)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.execute(SnapRunnableImpl.java:586)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.doRun(SnapRunnableImpl.java:877)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:436)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:120)
    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.base/java.lang.Thread.run(Unknown Source)
Caused by: com.snaplogic.api.ExecutionException: SLDB does not support data larger than 100 MB
    at com.snaplogic.common.url.protocol.sldb.SldbOutputStream.withinLimits(SldbOutputStream.java:230)
    at com.snaplogic.common.url.protocol.sldb.SldbOutputStream.write(SldbOutputStream.java:120)
    at java.base/java.io.BufferedOutputStream.write(Unknown Source)
    at com.snaplogic.snaps.binary.AbstractWriter.copy(AbstractWriter.java:522)
    at com.snaplogic.snaps.binary.AbstractWriter.writeData(AbstractWriter.java:582)
    ... 14 more
Reason: Possible reasons can be failure in URL connection or file access denial, detail: SLDB does not support data larger than 100 MB
Resolution: Address the reported issue.
Error Fingerprint[0] = efp:com.snaplogic.snaps.binary.BcA2Uzfu
Error Fingerprint[1] = efp:com.snaplogic.common.url.protocol.sldb.WQwFN6-j
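The `Caused by` line points at `SldbOutputStream.withinLimits`: SLDB (the project file space) rejects any stream that grows past 100 MB, so the only fixes are a smaller file or a different write target. A minimal sketch of that kind of guard, in plain Python (the class and names here are illustrative, not SnapLogic's actual implementation):

```python
# Illustrative sketch of a size-limited output stream: every write is
# checked against a hard cap, and the write that would cross the cap
# raises. This mirrors the behavior reported in the stack trace above.
SLDB_LIMIT = 100 * 1024 * 1024  # 100 MB


class LimitedSink:
    """A write target that refuses to grow past a fixed limit."""

    def __init__(self, limit=SLDB_LIMIT):
        self.limit = limit
        self.written = 0

    def write(self, chunk: bytes) -> None:
        # Reject the chunk that would push the total past the limit.
        if self.written + len(chunk) > self.limit:
            raise RuntimeError(
                "SLDB does not support data larger than 100 MB")
        self.written += len(chunk)


# Demo with a tiny limit: the first write fits, the second is refused.
sink = LimitedSink(limit=10)
sink.write(b"12345")       # ok, 5 bytes written
try:
    sink.write(b"123456")  # 5 + 6 > 10, so this raises
except RuntimeError:
    pass
```

The practical consequence: no File Writer setting changes this limit; the file has to land somewhere else (pipe.tmpDir, SFTP, S3, and so on, as discussed in the replies below).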
 
12 REPLIES

@koryknick - I checked with the admins: the temp space is allocated a minimum of 4 GB up to a maximum of 8 GB, so that should be sufficient.
Another suggested solution was to skip creating the Excel file in SnapLogic and connect the HTTP Client directly after the Snowflake Select, but that produces thousands of Excel files in the destination folder, each containing only 50 rows of data. Is there anything more we can do to make this work? Thanks.
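The "thousands of files, 50 rows each" symptom usually means each incoming batch triggers its own write, so accumulating the batches before a single write yields one large file instead. A minimal sketch of that grouping idea in plain Python (illustrative only; inside SnapLogic the equivalent is aggregating the documents ahead of the formatter, and CSV stands in here for the Excel output):

```python
# Illustrative sketch: many small row batches are concatenated into a
# single output instead of one output per batch.
import csv
import io


def batches_to_single_csv(batches):
    """Write every batch of rows into one in-memory CSV document."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    for batch in batches:       # each batch ~50 rows in the report above
        writer.writerows(batch)
    return buf.getvalue()


# Three separate batches end up as one five-row document.
batches = [[("a", 1), ("b", 2)], [("c", 3)], [("d", 4), ("e", 5)]]
merged = batches_to_single_csv(batches)
```

The design point is simply moving the "write" step outside the per-batch loop; whichever tool does the writing, the file count drops from one-per-batch to one.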

[screenshot attached: rustin_0-1707214153573.png]

 

koryknick
Employee

@rustin - There must be another issue causing the error when writing to pipe.tmpDir. I have verified in my environment that a file which fails to write to SLDB because it is over 100 MB is written successfully to my pipe.tmpDir.

[screenshot attached: koryknick_0-1706279909314.png]

[screenshot attached: koryknick_1-1706279936054.png]

Can you share the error you see when using pipe.tmpDir?  Is it an error in the File Writer or Excel Formatter?
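For reference, pointing the File Writer at the node's temporary space is typically done with an expression in the snap's File name field along these lines (a sketch; the exact file name is whatever your pipeline needs):

```
pipe.tmpDir + "/gdrive.xlsx"
```

pipe.tmpDir is the pipeline property mentioned above; the concatenated path resolves to a location on the executing node's local disk, which is not subject to the 100 MB SLDB limit.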

 

 

koryknick
Employee

@rustin - what is the current error?

@koryknick, the error is in the File Writer; I've attached files with the error details.

koryknick
Employee

@rustin - it looks like you're still trying to write to the SLDB which does not support files over 100MB.  Can you target a different endpoint to write your file, such as SFTP or S3?