02-28-2018 08:14 AM
I am noticing some performance issues when processing larger volumes of data (10 million records) using the Oracle Bulk Loader in SnapLogic. I can compare the job as it runs on our current integration tool against SnapLogic, and a job that takes 10 minutes on the current tool takes over an hour in SnapLogic. I am exploring increasing the batch and fetch sizes on the Oracle account, but I am wondering what the tradeoffs are and how to determine the optimal batch and fetch size, since the setting will impact all of our processing (we are an Oracle shop).
Right now our batch size and fetch size are set to the defaults (batch size 50, fetch size 100).
Any thoughts/suggestions on this topic are greatly appreciated.
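In case it helps frame the tradeoff: my understanding (an assumption on my part, not confirmed SnapLogic internals) is that the account's fetch size and batch size correspond to the standard Oracle JDBC settings, where fetch size controls how many rows come back per round trip on reads and batch size controls how many statements go out per round trip on writes. A standalone Java sketch of the two knobs, with hypothetical connection details and tables:

```java
import java.sql.*;

public class BatchVsFetchDemo {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connection details, for illustration only.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1", "scott", "tiger")) {

            // Fetch size: rows pulled from the server per round trip on a SELECT.
            // Larger values mean fewer round trips but more client-side memory.
            try (Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(100); // the default mentioned above
                try (ResultSet rs = stmt.executeQuery("SELECT id, name FROM src_table")) {
                    while (rs.next()) {
                        // process each row...
                    }
                }
            }

            // Batch size: INSERT/UPDATE statements sent to the server per round trip.
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO dst_table (id, name) VALUES (?, ?)")) {
                int batchSize = 50; // the default mentioned above
                for (int i = 0; i < 1000; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "row-" + i);
                    ps.addBatch();
                    if (i % batchSize == batchSize - 1) {
                        ps.executeBatch(); // one round trip for the whole batch
                    }
                }
                ps.executeBatch(); // flush any remainder
                conn.commit();
            }
        }
    }
}
```

In both cases the tradeoff is fewer network round trips against higher client-side memory use, which is why there is no single optimal value; it depends on row width and the memory available to the node doing the work.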
01-23-2019 05:37 PM
The fetch size should not affect the load; it is only used when reading from the database, e.g. with the SELECT Snap.
As for the batch size, it only takes effect when you use the INSERT or UPDATE Snaps. If you are using the Bulk Load Snaps, you should not see any such difference, since they use SQL*Loader (SQLLDR) under the covers.
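For anyone curious what "under the covers" means here: a SQL*Loader run is driven by a control file that describes the input and target table, rather than by JDBC statements. A minimal illustrative control file (file name, table, and columns are hypothetical, not what the Snap generates) looks like:

```
-- load.ctl: minimal control file for a conventional path load (illustrative)
LOAD DATA
INFILE 'data.csv'
APPEND
INTO TABLE dst_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id, name)
```

Since the data flows through sqlldr instead of prepared statements, the account's JDBC batch setting has nothing to act on.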
01-08-2021 06:31 AM
Does the Oracle Bulk Load Snap make use of the ROWS SQL*Loader command-line parameter, or, if the Snap does not set it, will SQL*Loader's default value be used?
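For context, in a standalone SQL*Loader run the parameter is passed on the command line (connection string and file names here are hypothetical):

```
sqlldr scott/tiger@ORCLPDB1 control=load.ctl log=load.log rows=5000
```

Per Oracle's SQL*Loader documentation, if ROWS is omitted the conventional path default is 64 rows per commit, while a direct path load by default performs a single data save at the end of the load.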