12-09-2019 01:15 PM
I have a requirement to move/copy many files (thousands of files) to Google Storage. From my reading of the SnapLogic documentation, it looks like we might have to use "Google BigQuery Bulk Load (Cloud Storage)" for this. Is this a correct understanding? Any further details, or links to a sample pipeline or Snap configuration, would be really helpful.
Thanks in advance.
12-16-2019 03:55 PM
@anivtk You can use the File Writer Snap with the account set to either Google Service or Google Storage. The File name field would be configured like s3:///<bucketName>/<fileName>.
The Google BigQuery Bulk Load (Cloud Storage) Snap is used to load data from Google Storage into Google BigQuery, so it isn't suitable for the requirement described.
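For anyone scripting the same bulk copy outside SnapLogic, here is a minimal sketch in Python. The directory path, bucket name, and `in/` prefix are all hypothetical examples; the helper only plans the local-file-to-`gs://`-URI mapping, and the actual upload (commented out) would use the official `google-cloud-storage` client, which requires credentials.

```python
from pathlib import Path

def plan_uploads(local_dir, bucket, prefix=""):
    """Map every file under local_dir to a gs:// destination URI,
    preserving the relative directory structure."""
    root = Path(local_dir)
    plan = []
    for path in sorted(root.rglob("*")):
        if path.is_file():
            rel = path.relative_to(root).as_posix()
            plan.append((str(path), f"gs://{bucket}/{prefix}{rel}"))
    return plan

# To actually perform the copy (assumes google-cloud-storage is
# installed and application-default credentials are configured):
#
# from google.cloud import storage
# client = storage.Client()
# bucket = client.bucket("my-bucket")          # hypothetical bucket
# for local_path, dest_uri in plan_uploads("/data/export", "my-bucket", "in/"):
#     blob_name = dest_uri.split("my-bucket/", 1)[1]
#     bucket.blob(blob_name).upload_from_filename(local_path)
```

The planning step is handy for thousands of files: you can log or verify the full mapping before any network transfer happens.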
12-20-2019 07:10 AM
Thanks @rohithmadhavan, this worked. 🙂