Forum Discussion
rustin - The SLDB (the file space in your SnapLogic project folder) has a limit of 100 MB. The path specified by pipe.tmpDir is limited by the amount of temp space allocated to SnapLogic on the execution node, so you would need to check with your server admins about that in your environment.
However, Excel itself has a hard limit of 1,048,576 rows and 16,384 columns per sheet. If you need more than that, would CSV be sufficient? The file would be significantly larger as a CSV, since the Excel (.xlsx) format compresses its data, but it may be worth trying.
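If it helps to see the arithmetic, here is a minimal Python sketch (outside SnapLogic, purely illustrative) of the two options: splitting a large result set into sheet-sized batches versus streaming everything to a single CSV. The function names (chunk_for_excel, write_csv) are hypothetical and not part of any SnapLogic API.

```python
import csv
from itertools import islice

# Hard per-sheet limits in the modern .xlsx format, as noted above.
EXCEL_MAX_ROWS = 1_048_576   # includes the header row
EXCEL_MAX_COLS = 16_384

def chunk_for_excel(rows, chunk_size=EXCEL_MAX_ROWS - 1):
    """Yield batches of rows that each fit on one sheet alongside a header row."""
    it = iter(rows)
    while True:
        batch = list(islice(it, chunk_size))
        if not batch:
            return
        yield batch

def write_csv(header, rows, path):
    """Stream everything to one CSV file, which has no per-sheet row limit."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)

if __name__ == "__main__":
    total = 2_000_000  # more rows than a single Excel sheet can hold
    rows = ([i, i * i] for i in range(total))
    for sheet_no, batch in enumerate(chunk_for_excel(rows), start=1):
        print(f"sheet {sheet_no}: {len(batch):,} rows")
```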
Also note that pipe.tmpDir is transient and exists only during pipeline execution, so as soon as the pipeline completes, everything in that directory is purged.
Hope this helps!
koryknick - I checked with the admins, and the temp space allocated ranges from a minimum of 4 GB to a maximum of 8 GB, so this should be sufficient.
Another suggested solution was to skip creating the Excel file in SnapLogic and instead connect the HTTP Client straight after the Snowflake Select, but this produces thousands of Excel files in the destination folder, each containing only 50 rows of data. Is there something more we can do to make it work? Thanks.