Options for Preserve temp file, Temp file name, and Delete files upon exit in S3 when using the Redshift Bulk Load Snap

It would be great to have “Preserve temp file” and “Temp file name” options in the Redshift Bulk Load Snap.

We already have these options in the “Google BigQuery Bulk Load” Snap, and they are especially useful when a pipeline fails: the temp file is preserved so that it can be reused later.

In the same way, we need the following options in the Redshift Bulk Load Snap:

  • Preserve temp file
  • Temp file name
  • Delete files upon exit

Thanks for the request. Can you please help me understand the reason behind this enhancement? Is it more performance related, where the file would otherwise have to be copied again?

For example, I need to load millions or billions of rows from Oracle/SQL Server database tables into Redshift through S3.

Suppose I have extracted all the data and the COPY command then fails in the Redshift Bulk Load Snap. In that case I have to read the millions or billions of rows from the Oracle/SQL Server tables all over again.

By default, the Redshift Bulk Load Snap has no way to preserve the temp file under a proper, customizable name; the files are not available in the S3 bucket afterwards.

Also, if the files are in S3, I can simply make changes in S3 itself instead of hitting the source database again and again. And if we preserve the files under proper names, we can reuse them later if there is any data loss on the Redshift end.
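To illustrate what this would enable: since the Snap already stages data in S3 and issues a COPY, a preserved file with a known name could be reloaded manually. This is only a sketch; the table, bucket, file path, IAM role, and format options below are hypothetical placeholders and depend on how the Snap actually stages the data.

```sql
-- Hypothetical manual recovery: re-run the load directly from a
-- preserved staging file, instead of re-extracting millions of
-- rows from Oracle/SQL Server.
COPY staging.orders
FROM 's3://etl-temp-bucket/redshift-bulk-load/orders_batch_001.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS CSV
GZIP;
```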

I hope this gives you an idea of the scenarios I am talking about. This file-preserving behavior already exists in the Google BigQuery Bulk Load Snap (Cloud Storage).

Even today I faced this issue:

[screenshot: COPY command error]

This is a COPY command error. The fix is to change the column's datatype on the Redshift end, which is simple enough, but then I have to read all the data from the source again. :frowning:

If I had the file in S3, I could simply change the datatype on the Redshift end and re-run the COPY command with the corresponding S3 file path. But the file is not available in S3, and that is the problem.
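As a sketch of the recovery path this feature would enable, assuming the COPY error was a too-narrow VARCHAR column (the table, column, S3 path, and IAM role here are hypothetical):

```sql
-- Fix the offending column on the Redshift end.
-- (Redshift allows increasing a VARCHAR column's length in place;
--  other type changes need an add/copy/rename of the column.)
ALTER TABLE staging.orders
  ALTER COLUMN customer_name TYPE VARCHAR(512);

-- Then reload from the preserved S3 file: no second pass over the
-- source database is needed.
COPY staging.orders
FROM 's3://etl-temp-bucket/redshift-bulk-load/orders_batch_001.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy-role'
FORMAT AS CSV;
```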

I hope that gives a better idea!

I second this request :slight_smile: