Snaps to copy data to Google Storage
I have a requirement to move/copy many files (thousands of them) to Google Storage. From my reading of the SnapLogic documentation, it looks like we might have to use the “Google BigQuery Bulk Load (Cloud Storage)” Snap for this. Is that understanding correct? Any further details, or links to a sample pipeline or Snap configuration, would be really helpful. Thanks in advance.

Pattern to ingest data from Google Storage into Google BigQuery using AVRO file format
Created by @pkoppishetty

This pipeline transfers data from a local machine to Google Storage, then loads it into Google BigQuery using Google Storage as the data source.

Configuration

Make sure the Upload type is “Upload existing files from Google Cloud Storage” and the file path settings are: file format: “AVRO”, file path: “gs://gcs_cloud_1/data/leads.avro”

Sources: Local machine, Google Storage
Targets: Google BigQuery
Snaps used: File Reader, File Writer, Google BigQuery Bulk Load (Cloud Storage)

Downloads

Pattern to ingest data from Google Storage into Google BigQuery using AVRO file format.slp (5.6 KB)
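For reference, outside of SnapLogic the same bulk load can be expressed as a BigQuery load job whose configuration points at the AVRO file in Cloud Storage. Below is a minimal sketch of that job configuration (the JSON body a BigQuery `jobs.insert` request would carry). Only the gs:// path and AVRO format come from the pattern above; the project, dataset, and table names are hypothetical placeholders.

```python
# Hedged sketch: a BigQuery load-job configuration roughly equivalent to the
# "Google BigQuery Bulk Load (Cloud Storage)" snap settings in the pattern.
# Project/dataset/table names below are placeholder assumptions.
def make_avro_load_config(source_uri, project, dataset, table):
    """Build a BigQuery load-job configuration for an AVRO file in GCS."""
    return {
        "configuration": {
            "load": {
                "sourceUris": [source_uri],   # e.g. "gs://gcs_cloud_1/data/leads.avro"
                "sourceFormat": "AVRO",       # matches the pattern's file format
                "destinationTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "writeDisposition": "WRITE_APPEND",  # append rows to the target table
            }
        }
    }

config = make_avro_load_config(
    "gs://gcs_cloud_1/data/leads.avro", "my-project", "my_dataset", "leads"
)
```

One advantage of AVRO as the source format here is that BigQuery reads the schema from the file itself, so no separate schema definition is needed in the load configuration.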