Getting errors moving large data to Snowflake

Hello,
We are trying to move a large amount of data from MS SQL Server to Snowflake using external staging, and are getting the following error:

Too many S3 upload parts, maximum number of parts is 10000
Resolution:
Increase the buffer size so that the file size divided by the buffer size is smaller than 10000.

Reason:
The file is too large for the buffer size 10000000
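To make the error's arithmetic concrete (the 10,000-part limit and the 10,000,000-byte buffer size come from the error message itself; the 500 GB file size below is just an illustration, not our actual file size):

```python
import math

S3_MAX_PARTS = 10_000      # hard S3 multipart-upload limit from the error
buffer_size = 10_000_000   # current buffer (part) size from the error, ~10 MB

# Largest file that fits in 10,000 parts at the current buffer size:
max_file_size = S3_MAX_PARTS * buffer_size  # 100,000,000,000 bytes (~100 GB)

def min_buffer_size(file_size_bytes: int) -> int:
    """Smallest buffer size that keeps the part count under the S3 limit."""
    return math.ceil(file_size_bytes / S3_MAX_PARTS)

# e.g. a hypothetical 500 GB staged file would need a buffer of at least 50 MB
print(min_buffer_size(500 * 10**9))
```

So any staged file larger than roughly 100 GB will exceed the part limit unless the buffer size is raised somewhere.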

The source table contains 1 billion records. We are using an external AWS S3 bucket as the staging area.

I checked both the Snowflake account and the Snowflake bulk snap, but cannot find any configuration related to the buffer size.
It would be appreciated if anyone could share a solution for this.

Thanks!

I found this article on the AWS S3 website stating that the multipart uploader (which I believe SnapLogic uses internally) has a limit of 10,000 parts per upload.
https://docs.aws.amazon.com/AmazonS3/latest/dev/qfacts.html

Is there a way I can increase the part (buffer) size in SnapLogic?
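For comparison, this is how the part size is typically configured when uploading to S3 directly with the AWS SDK for Python (boto3). I don't know whether SnapLogic exposes an equivalent setting; the bucket, key, and file names below are placeholders:

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Raise the part (chunk) size so a large file stays under the 10,000-part limit.
# 64 MB parts allow objects up to roughly 640 GB.
config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,  # switch to multipart above 64 MB
    multipart_chunksize=64 * 1024 * 1024,  # 64 MB per part
)

s3 = boto3.client("s3")
s3.upload_file(
    "staged_data.csv.gz",            # placeholder local file
    "my-staging-bucket",             # placeholder bucket
    "stage/staged_data.csv.gz",      # placeholder key
    Config=config,
)
```

If SnapLogic uses this SDK under the hood, the "buffer size" from the error presumably maps to something like `multipart_chunksize`, but that is speculation on my part.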