Managing Batch Processing Errors in SnapLogic with DB2 and Salesforce
Hi Team, I’m working on a pipeline where I read data from Salesforce, insert it into a DB2 database using the Generic JDBC Execute snap, and then update Salesforce based on the insert results.

The challenge I’m facing is with the batch processing behavior of the Generic JDBC Execute snap: when inserting records into DB2 in batch mode, if any one record fails, the entire batch is routed to the Error view. However, DB2 still inserts the valid records of the batch, even though the snap treats the whole batch as failed. As a result, when I update Salesforce:

- Successfully inserted records are incorrectly marked with an error status (3).
- Genuinely failed records are also marked as 3.
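To make the behavior concrete, here is a small simulation of the partial-batch effect. I’m using Python’s `sqlite3` in autocommit mode as a stand-in for DB2 over JDBC (the `accounts` table and sample rows are invented for illustration), but the shape of the problem is the same: one bad row makes the batch raise an error, while some of its rows still land in the table.

```python
import sqlite3

# Stand-in for DB2: an in-memory SQLite table with a primary key.
# isolation_level=None puts the connection in autocommit mode, so
# each row of the batch is committed as it executes.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT)")

# Hypothetical batch: the third row re-uses id 2 and violates the PK.
batch = [(1, "Acme"), (2, "Globex"), (2, "Initech"), (4, "Umbrella")]

try:
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", batch)
except sqlite3.IntegrityError as e:
    # One bad row makes the *batch* as a whole fail...
    print("batch failed:", e)

# ...yet the rows executed before the failure are already committed.
surviving = conn.execute("SELECT id FROM accounts ORDER BY id").fetchall()
print(surviving)
```

One caveat: `sqlite3.executemany` stops at the first failing row, whereas a JDBC driver in batch mode may attempt every row and report per-row results via `BatchUpdateException.getUpdateCounts()`; whether the DB2 driver surfaces that through the snap is part of what I’m asking.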
This leads to wrong updates back to Salesforce, because there is no way to differentiate the records that actually failed from those that succeeded but happened to be in the failed batch.

My question: is there a recommended approach or best practice in SnapLogic to:
1. Identify only the truly failed records in a JDBC batch operation,
2. Prevent valid records from being marked as errors when one record in the batch fails, or
3. Force the batch to behave atomically (all-or-nothing), or process inserts individually?
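For context, one design pattern I’m considering is “batch first, replay row-by-row on failure,” so every record ends up with an accurate per-row status. Here is a minimal sketch of that pattern in plain Python against SQLite (the function name, table, and rows are my own illustration, not a SnapLogic or DB2 API):

```python
import sqlite3

def insert_with_fallback(conn, sql, rows):
    """Try the whole batch in one transaction; if anything fails,
    roll it back and replay the rows one at a time so every row
    gets an accurate ok/error status."""
    try:
        with conn:                       # commits on success, rolls back on error
            conn.executemany(sql, rows)
        return [(row, "ok") for row in rows]
    except sqlite3.Error:
        pass                             # fall through to the row-by-row replay

    statuses = []
    for row in rows:
        try:
            with conn:                   # one transaction per row
                conn.execute(sql, row)
            statuses.append((row, "ok"))
        except sqlite3.Error as exc:
            statuses.append((row, f"error: {exc}"))
    return statuses

# Illustrative usage: the third row violates the primary key.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT)")
batch = [(1, "Acme"), (2, "Globex"), (2, "Initech"), (4, "Umbrella")]
results = insert_with_fallback(conn, "INSERT INTO accounts VALUES (?, ?)", batch)
for row, status in results:
    print(row, status)
```

Note that this sketch assumes the failed batch can be rolled back before the replay. In the DB2 behavior described above, where the valid rows are already committed, a replay would instead need idempotent inserts (e.g. a MERGE) or a duplicate-key check so previously successful rows aren’t misclassified as failures. In pipeline terms, I imagine this maps to routing the batched snap’s Error view into a second JDBC Execute processing one record at a time, if the snap supports that configuration — which is essentially what I’m asking.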
Any guidance, design pattern, or snap-level configuration to handle this kind of partial failure scenario would be really helpful. Thanks in advance!
