I am noticing some performance issues when processing larger volumes of data (10 million records) using the Oracle Bulk Loader in SnapLogic. Comparing the same job side by side, a run that takes 10 minutes in our current integration tool takes over an hour in SnapLogic. I am exploring increasing the batch and fetch size on the Oracle account, but I'm wondering what the tradeoffs are and how to determine the optimal batch and fetch size, since the account settings will impact all of our processing (we run entirely on an Oracle database).
Right now our batch and fetch sizes are set to the defaults (batch 50, fetch 100).
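To get a feel for the batch-size tradeoff before touching the shared account settings, I've been thinking about benchmarking insert throughput at different batch sizes outside SnapLogic. Below is a minimal sketch of that idea; it uses sqlite3 purely as a stand-in so it runs anywhere (against Oracle you'd swap in python-oracledb or a JDBC client, and the `benchmark_batch_insert` helper name and timings are just illustrative, not from any SnapLogic internals):

```python
import sqlite3
import time

def benchmark_batch_insert(rows, batch_size):
    """Insert `rows` in chunks of `batch_size`; return elapsed seconds.

    Larger batches mean fewer executemany round trips and commits,
    which is the same lever the SnapLogic batch-size setting pulls.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
    start = time.perf_counter()
    for i in range(0, len(rows), batch_size):
        conn.executemany("INSERT INTO t VALUES (?, ?)", rows[i:i + batch_size])
        conn.commit()  # one commit per batch, as a batched loader would
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

if __name__ == "__main__":
    rows = [(i, f"row-{i}") for i in range(100_000)]
    for batch_size in (50, 500, 5_000, 50_000):
        elapsed = benchmark_batch_insert(rows, batch_size)
        print(f"batch_size={batch_size:>6}: {elapsed:.2f}s")
```

The general pattern I'd expect is diminishing returns: going from 50 to a few thousand cuts round trips dramatically, while very large batches mostly just increase client memory use, but I'd rather confirm that empirically than guess.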
Any thoughts or suggestions on this topic are greatly appreciated.