11-18-2020 02:36 PM
Using the Snowflake Bulk Load snap to load a big table. Got the error after extracting 43+ million rows from the source, without inserting any rows into the target table.
I have changed the buffer size from 10 MB to 100 MB but no luck. I am not sure what other settings need to be changed to get this pipeline to run.
Here is the detailed error message.
Snap errors: {ruuid=79122f75-f6ed-4af3-867b-012f2c8e03fa, label=Snowflake - Bulk Load, failure=Error streaming data while writing to a temporary file, reason=No space left on device, resolution=Address the reported issue.}
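From the error text, the failure looks like the node running out of local disk while the snap spools rows to a temporary file before staging them in Snowflake, rather than a buffer-size issue. As a quick diagnostic, here is a minimal sketch (assuming the temporary file goes to the default system temp directory; adjust the path if your node is configured differently) to check how much space is actually free there:

```python
import shutil
import tempfile

# Directory where the temporary staging file is assumed to be written
# (default system temp dir; your node may be configured differently).
tmp_dir = tempfile.gettempdir()

usage = shutil.disk_usage(tmp_dir)
gib = 1024 ** 3
print(f"Temp dir: {tmp_dir}")
print(f"Total: {usage.total / gib:.1f} GiB, "
      f"Used: {usage.used / gib:.1f} GiB, "
      f"Free: {usage.free / gib:.1f} GiB")

# Rough sanity check (assumed average row size of ~1 KiB): 43M rows
# would be ~40 GiB of staged data, which can easily exhaust a small
# temp volume.
```

If the free space is well below the size of the extracted data, freeing or enlarging that volume (or pointing the temp location at a larger disk) is likely needed regardless of the buffer setting.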
12-08-2020 06:04 AM
@erkonline did you get any response/resolution to this? I’m getting the same error when doing a Snowflake Bulk Upsert.
12-08-2020 03:09 PM
I raised a ticket with the SnapLogic support team and the issue was resolved without any changes on our side; they mentioned that a hotfix may have fixed the problem.