Forum Discussion
This is not surprising.
Oracle databases scope a transaction to a single database connection (unless you are using XA…). In order for a pipeline to group multiple snap operations into a single database transaction, the snaps would need to somehow pass the database connection along from snap to snap for each pipeline invocation (e.g., by using a pipeline-instance-aware connection pool) or use XA transactions.
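To make the connection-scoping point concrete, here is a minimal sketch using python-oracledb (an assumption for illustration; SnapLogic snaps actually open their own JDBC connections via accounts). It shows that work done on one connection is invisible to, and cannot be committed by, a second connection, which is why a transaction cannot span snaps that each grab their own connection.

```python
# Minimal sketch (assumption: python-oracledb stands in for the JDBC
# connections that SnapLogic snaps open internally). Connection settings
# below are placeholders.
import oracledb

DSN = "localhost/orclpdb1"  # hypothetical database service

conn_a = oracledb.connect(user="app", password="secret", dsn=DSN)
conn_b = oracledb.connect(user="app", password="secret", dsn=DSN)

cur_a = conn_a.cursor()
# Uncommitted insert on connection A (think of it as "snap 1").
cur_a.execute("INSERT INTO orders (id, status) VALUES (:1, :2)", [1, "NEW"])

cur_b = conn_b.cursor()
# Connection B ("snap 2") cannot see A's uncommitted row, and committing or
# rolling back on B does nothing to A's in-flight transaction.
cur_b.execute("SELECT COUNT(*) FROM orders WHERE id = :1", [1])
print(cur_b.fetchone()[0])  # prints 0 -- the transaction is scoped to conn_a

conn_a.rollback()  # only conn_a can finish (commit or roll back) its own work
conn_a.close()
conn_b.close()
```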
- philliperamos · 6 years ago · Contributor
That sounds good.
I pivoted the data to JSON, and then used a combination of the JSON Splitter/Mapper and Data Validator snaps to validate it. I still had to hard-code some JSON generation scripts, but that was the easy part.
Thanks!
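For anyone reading along, here is a rough Python sketch of that split-and-validate idea outside SnapLogic. The field names and rules are hypothetical, not the poster's actual schema; the loop over records mirrors the JSON Splitter step and the per-field checks mirror the Data Validator step.

```python
# Hypothetical example: split a pivoted JSON array into per-row records and
# validate each field against simple rules.
import json
import re

rows_json = '[{"id": "1", "email": "a@example.com"}, {"id": "", "email": "bad"}]'

rules = {
    "id":    lambda v: v.strip() != "",                       # required field
    "email": lambda v: re.match(r"[^@]+@[^@]+\.[^@]+$", v),   # basic format check
}

errors = []
for i, record in enumerate(json.loads(rows_json)):   # "JSON Splitter" step
    for field, check in rules.items():                # "Data Validator" step
        if not check(record.get(field, "")):
            errors.append({"row": i, "field": field, "value": record.get(field)})

print(errors)  # the second record fails both the id and email checks
```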
- Ksivagurunathan · 6 years ago · Contributor
The way we developed it, we don't need to change the SnapLogic pipeline to validate a different set of files, whether that means a whole new set of validations or a few new column validations for the same files. All we need to do is add a validation rule to the table along with the corresponding file name. It's a more dynamic way of validating the data and column names in a file.
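Roughly, that table-driven idea looks like the sketch below. All table, column, and rule names are hypothetical; the point is that adding a validation means adding a row to the rules table, not editing the pipeline.

```python
# Sketch of a table-driven validator. rules_table stands in for rows fetched
# from a hypothetical VALIDATION_RULES database table.
import re

rules_table = [
    {"file_name": "customers.csv", "column": "id",    "rule": "not_null"},
    {"file_name": "customers.csv", "column": "email", "rule": "regex", "arg": r"[^@]+@[^@]+"},
]

CHECKS = {
    "not_null": lambda value, arg: value not in (None, ""),
    "regex":    lambda value, arg: re.search(arg, value or "") is not None,
}

def validate(file_name, records):
    """Apply only the rules registered for this file name."""
    active = [r for r in rules_table if r["file_name"] == file_name]
    failures = []
    for i, rec in enumerate(records):
        for rule in active:
            check = CHECKS[rule["rule"]]
            if not check(rec.get(rule["column"]), rule.get("arg")):
                failures.append((i, rule["column"], rule["rule"]))
    return failures

print(validate("customers.csv", [{"id": "", "email": "x@example.com"}]))
# [(0, 'id', 'not_null')]
```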
- philliperamos · 6 years ago · Contributor
Thanks. We're thinking of adopting a similar dynamic-validation approach, since the method I described above is causing serious memory problems.