Getting errors move large data to Snowflake

oggday
New Contributor

Hello,
We are trying to move a large amount of data from MS SQL to Snowflake using external staging, and we are getting the following error:

Too many S3 upload parts, maximum number of parts is 10000
Resolution:
Increase the buffer size so that the file size divided by the buffer size is smaller than 10000.

Reason:
The file is too large for the buffer size 10000000
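The resolution in the error message is just arithmetic: S3 multipart uploads allow at most 10,000 parts, so the buffer (part) size must be at least the file size divided by 10,000, rounded up. A quick sketch of that calculation (the 150 GB file size below is only an illustrative example, not our actual file):

```python
MAX_PARTS = 10_000  # hard limit on parts per S3 multipart upload

def min_buffer_size(file_size_bytes: int, max_parts: int = MAX_PARTS) -> int:
    """Smallest buffer (part) size so the file splits into at most max_parts parts."""
    # Ceiling division: every started part counts toward the limit.
    return -(-file_size_bytes // max_parts)

# Illustrative example: a 150 GB staged file with the buffer from the error message
file_size = 150 * 1024**3
default_buffer = 10_000_000                              # the 10000000 bytes in the error
parts_with_default = -(-file_size // default_buffer)     # 16107 parts -> exceeds 10000
needed = min_buffer_size(file_size)                      # ~16 MB buffer would be enough
print(parts_with_default, needed)
```

So a 150 GB file at a 10,000,000-byte buffer needs over 16,000 parts, which is why the upload fails; raising the buffer to roughly 16 MB brings it back under the limit.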


The source table contains 1 billion records, and we are using an external AWS S3 bucket as the staging area.

I checked both the Snowflake account settings and the Snowflake Bulk Load Snap, but cannot find any configuration related to the buffer size.
It would be appreciated if anyone could share a solution for this.

Thanks!

1 REPLY

oggday
New Contributor

I found an article on the AWS S3 website saying that the multipart uploader (which I believe SnapLogic uses internally) has a limit of 10,000 parts per upload.

Is there a way I can increase the part (buffer) size in SnapLogic?
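For context, the published S3 multipart upload limits (from the AWS documentation: 5 MiB minimum part size, 5 GiB maximum part size, 10,000 parts maximum) put a hard ceiling on how large an object a given buffer size can carry. A small illustrative sketch of that ceiling (this is not SnapLogic code, just the limit arithmetic):

```python
# Published S3 multipart upload limits (per the AWS S3 documentation)
MIN_PART = 5 * 1024**2   # 5 MiB minimum part size (except the final part)
MAX_PART = 5 * 1024**3   # 5 GiB maximum part size
MAX_PARTS = 10_000       # maximum number of parts per upload

def largest_uploadable(buffer_size: int) -> int:
    """Largest object a multipart upload can carry at a given part (buffer) size."""
    if not MIN_PART <= buffer_size <= MAX_PART:
        raise ValueError("part size outside S3's 5 MiB - 5 GiB range")
    return buffer_size * MAX_PARTS

print(largest_uploadable(10 * 1024**2) / 1024**3)  # 10 MiB parts cap out near 97.7 GiB
```

In other words, a ~10 MB buffer tops out around 100 GB per staged file; anything larger needs a proportionally bigger buffer, or the data split into more, smaller staged files.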