12-09-2019 01:15 PM
I have a requirement to move/copy many files (1000s of files) to Google Storage. I did some reading in the SnapLogic documentation and saw that we might have to use “Google BigQuery Bulk Load (Cloud Storage)” for this. Is this the correct understanding? Any more details, or links to a sample pipeline or snap configuration, would be really helpful.
Thanks in advance.
12-16-2019 03:55 PM
@anivtk You can use the File Writer Snap with its account set to either a Google Service account or a Google Storage account. The File name field would be configured like s3:///&lt;bucketName&gt;/&lt;fileName&gt;.
The Google BigQuery Bulk Load (Cloud Storage) Snap loads data from Google Storage into Google BigQuery, so it isn’t suitable for the requirement described.
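Since the requirement involves thousands of files, the File name field can also be made dynamic using SnapLogic's expression language (toggle the = icon on the field), so each incoming document is written to its own object. A minimal sketch, assuming a `_bucketName` pipeline parameter and a `$filename` field in the incoming document (both hypothetical names, not from the original thread):

```
"s3:///" + _bucketName + "/" + $filename
```

With this expression enabled, a single File Writer Snap in the pipeline writes one object per document flowing through it, rather than requiring a separate Snap per file.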
12-20-2019 07:10 AM
Thanks rohithmadhavan, this worked. 🙂