Forum Discussion
Yes, it would be very helpful if the Stored Procedure snap could capture DBMS_OUTPUT. In the meantime, I’ve written a procedure that writes to a file, which we can use in place of the DBMS_OUTPUT calls, but that means substantial changes to each Oracle program we want to run from SnapLogic.
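As a rough illustration (the actual procedure isn’t shown here), the replacement can be a small UTL_FILE wrapper along these lines; the directory object, file name, and procedure name are placeholders:

```sql
-- Minimal sketch of a file-based replacement for DBMS_OUTPUT.PUT_LINE.
-- Assumes a writable directory object REPORT_DIR already exists;
-- all names are placeholders, not the actual code referred to above.
CREATE OR REPLACE PROCEDURE log_line (p_text IN VARCHAR2) IS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  -- Open in append mode so repeated calls accumulate in one report file.
  l_file := UTL_FILE.FOPEN('REPORT_DIR', 'report_output.log', 'a', 32767);
  UTL_FILE.PUT_LINE(l_file, p_text);
  UTL_FILE.FCLOSE(l_file);
END log_line;
/
```

Each existing DBMS_OUTPUT.PUT_LINE call then becomes a call to log_line instead.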
Also, regarding the global temporary table report issue, it would be helpful if SnapLogic didn’t run multiple PL/SQL procedure calls concurrently within the same Oracle session. Right now, if program A and program B are run concurrently from two different pipelines, they share the same Oracle session and therefore the same temporary table instance. Both write report data into it, and whichever program finishes first produces a report containing mixed output from both A and B.
Enabling and capturing DBMS_OUTPUT looks very doable, but it is some work. If you would like this feature, it would be great if you could log a support ticket requesting it.
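For reference, the capture itself boils down to the standard DBMS_OUTPUT.ENABLE / GET_LINES API. The sketch below is only illustrative, with a hypothetical procedure and log table, and is not how the snap works today:

```sql
-- Illustrative only: enable the buffer, run the procedure, then drain the
-- buffered lines with GET_LINES. my_report_proc and proc_output_log are
-- hypothetical names.
DECLARE
  l_lines    DBMS_OUTPUT.CHARARR;
  l_numlines INTEGER := 1000;               -- fetch up to 1000 lines per call
BEGIN
  DBMS_OUTPUT.ENABLE(buffer_size => NULL);  -- NULL = unlimited buffer
  my_report_proc;                           -- calls DBMS_OUTPUT.PUT_LINE internally
  DBMS_OUTPUT.GET_LINES(l_lines, l_numlines);
  FOR i IN 1 .. l_numlines LOOP
    INSERT INTO proc_output_log (line_no, line_text) VALUES (i, l_lines(i));
  END LOOP;
  COMMIT;
END;
/
```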
Thanks for clarifying the GTT issue. Actually, you were pretty clear in your first post. The stored procedure snap is associated with an Account which will try to re-use its Connection object (this maps to a db session in Oracle).
When you describe the mixed output, I suspect that the Stored Procedure snaps in the different pipelines refer to the same account. If that’s true, you could experiment with cloning the account (e.g., Account2) and referring to the clone in one of the pipelines. Since GTT data is private to a database session, concurrent pipeline runs should then write to separate instances of the GTT data.
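As a quick illustration of why separate sessions should help (the table and column names below are made up, not your schema):

```sql
-- GTT rows are visible only to the session that inserted them.
CREATE GLOBAL TEMPORARY TABLE report_gtt (
  line_no   NUMBER,
  line_text VARCHAR2(4000)
) ON COMMIT PRESERVE ROWS;

-- Session 1 (pipeline A, original account):
INSERT INTO report_gtt VALUES (1, 'program A output');

-- Session 2 (pipeline B, cloned account) sees none of session 1's rows:
SELECT COUNT(*) FROM report_gtt;   -- returns 0 in session 2
```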
- philliperamos (6 years ago, Contributor)
That sounds good.
I pivoted the data as JSON, and then used a combination of JSON Splitter, Mapper, and Data Validator snaps to validate the data. I still had to hard-code some JSON generation scripts, but that was the easy part.
Thanks!
- Ksivagurunathan (6 years ago, Contributor)
The way we developed it, we don’t need to change the SnapLogic pipeline to validate a different set of files, whether that means a new set of validations or a few new column validations on the same files. All we need to do is add a validation rule to the table along with the corresponding file name. It’s a more dynamic validation of the data and column names in a file.
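For context, the rule table can look something like the sketch below; the names are illustrative, not our actual schema:

```sql
-- Hypothetical table-driven validation rules; adding a row adds a validation
-- without touching the pipeline.
CREATE TABLE file_validation_rule (
  file_name   VARCHAR2(200),
  column_name VARCHAR2(100),
  rule_type   VARCHAR2(30),    -- e.g. NOT_NULL, REGEX, MAX_LENGTH
  rule_value  VARCHAR2(400)    -- pattern or limit used by rule_type
);

INSERT INTO file_validation_rule
VALUES ('customers.csv', 'email', 'REGEX', '^[^@]+@[^@]+$');
```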
- philliperamos (6 years ago, Contributor)
Thanks. We’re thinking of doing something similar with the dynamic validation approach, as the approach I described above is leading to huge memory problems.