I understand that SnapLogic is not perceived as a typical migration tool for one-time migrations (i.e. an ETL tool) but more as an integration tool (iPaaS). Is that true?
If not, can SnapLogic be used for a one-time migration, and how much load can it handle?
For example, say we want to bring a few million records into SnapLogic (say, a Customer Master for the sake of this conversation), apply business rules and transformations to the source data fields, and then send the result to downstream systems. Will SnapLogic be able to handle such a high-volume load with transformations? In short, can it be used like an ETL tool for one-time data loads as well?
Thanks in advance.
SnapLogic can be used for both small and large volumes of data. Its streaming architecture in fact lends itself to being very efficient for those larger batch operations, since it does not keep the whole data set in memory at any point in time.
For those larger batch loads, what is probably more relevant is the availability of the bulk capabilities on the endpoint. For some endpoints, like Salesforce, Oracle, Redshift and a number of others, SnapLogic provides specific Snaps which support the bulk capabilities of the endpoint.
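To illustrate the streaming idea described above (outside SnapLogic itself): records are read, transformed, and written one at a time, so memory use stays roughly constant no matter how many rows flow through. A minimal Python sketch, with a hypothetical `transform` business rule and in-memory stand-ins for the source and target:

```python
import csv
import io

def transform(row):
    # Hypothetical business rule: normalize the customer name field.
    row["name"] = row["name"].strip().title()
    return row

def stream_transform(src, dst):
    """Read, transform, and write one record at a time so memory use
    stays constant regardless of how many records flow through."""
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:  # only one row is held in memory at a time
        writer.writerow(transform(row))

# Small in-memory example standing in for a multi-million-row source.
source = io.StringIO("id,name\n1,  alice smith \n2, BOB JONES \n")
target = io.StringIO()
stream_transform(source, target)
```

The same per-record pattern is what lets a pipeline process millions of rows without the working set growing with the data volume.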
I have a scenario where I need to read 700 MB to 1 GB of gzip and CSV data, split across multiple files, from FTP, perform some transformation operations, and then load it into a target (possibly Redshift or Google BigQuery).
What is the best, most optimized approach for this load, and can SnapLogic handle this kind of huge volume? A single batch is approximately 4-6 GB of data files from FTP.
Kindly suggest the most optimized approach. Thanks in advance!
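The streaming principle from the earlier answer applies to this scenario too: decompress and parse each gzip CSV incrementally rather than inflating a whole file into memory, then hand rows (or staged files) to the target's bulk loader. A minimal Python sketch, using a small in-memory gzip as a stand-in for a multi-GB file fetched over FTP:

```python
import csv
import gzip
import io

def rows_from_gzip_csv(fileobj):
    """Stream rows out of a gzip-compressed CSV without ever holding
    the decompressed file in memory."""
    with gzip.open(fileobj, mode="rt", newline="") as text:
        yield from csv.DictReader(text)

# In a real pipeline the file object would come from ftplib or an
# FTP Snap; here a tiny in-memory gzip stands in for the download.
raw = gzip.compress(b"id,amount\n1,10\n2,32\n")
total = sum(int(r["amount"]) for r in rows_from_gzip_csv(io.BytesIO(raw)))
```

For Redshift or BigQuery, the efficient final step is generally their bulk-load path (staged files plus a bulk copy/load job) rather than row-by-row inserts, which is what the endpoint-specific bulk Snaps mentioned above provide.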
Hi, I have a huge volume of data, around 16 GB. Will SnapLogic handle a direct extract and load to Snowflake using bulk loads? Will there be any issues with respect to resource contention, and will it run for hours?