DBMS_OUTPUT and GTT issues running PL/SQL programs in Snaplogic

wpenfold
New Contributor III

We are having issues with output when running our PL/SQL programs from Snaplogic.

Many of our existing PL/SQL packages use DBMS_OUTPUT to write information about the execution of the job, like row counts, warning messages, etc. When run from within Oracle, the DBMS_OUTPUT buffer gets written to a log file automatically. When run from Snaplogic, no log file is written.
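
For context, the pattern in these packages is just plain DBMS_OUTPUT.PUT_LINE calls sprinkled through the code, roughly like this (the procedure and table names here are made up for illustration):

  CREATE OR REPLACE PROCEDURE load_customers IS
    v_rows NUMBER;
  BEGIN
    INSERT INTO customers_stg
    SELECT * FROM customers_ext;
    v_rows := SQL%ROWCOUNT;

    DBMS_OUTPUT.PUT_LINE('load_customers: inserted ' || v_rows || ' rows');
    IF v_rows = 0 THEN
      DBMS_OUTPUT.PUT_LINE('WARNING: no rows were loaded');
    END IF;

    COMMIT;
  END load_customers;
  /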

PL/SQL programs that produce a more detailed report use a Global Temporary Table (GTT) to build the report during processing, then write the data from the GTT to the report file at the end of the program. This doesn’t work in Snaplogic either, because instead of each job running in its own session, all of the Oracle jobs that Snaplogic runs share the same session, and hence the same instance of the GTT. If multiple programs are running at the same time, the reports become gobbledegook.
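
For anyone who hasn’t seen the pattern, it looks roughly like this (again, the names are illustrative only):

  -- The report GTT: rows are private to the session that inserts them
  -- and are kept until that session ends
  CREATE GLOBAL TEMPORARY TABLE report_gtt (
    line_no  NUMBER,
    line_txt VARCHAR2(4000)
  ) ON COMMIT PRESERVE ROWS;

  -- During processing, the program appends report lines:
  INSERT INTO report_gtt (line_no, line_txt)
  VALUES (1, 'Processed 1,234 orders');

  -- At the end, it reads report_gtt back and writes the lines to the report file.
  -- Because GTT rows are visible only within the inserting session, two programs
  -- sharing one session also share one set of rows, which is what scrambles the reports.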

The end result is that every legacy PL/SQL program we want to run from Snaplogic has to be rewritten.

Anyone else having this issue? What have you done about it?


mbowen
Employee

Hi Wendy:

SnapLogic supports lots of databases, and many of them share the same logic for invoking stored procedures/functions. Many databases also have vendor-specific hooks, etc., some of which have been implemented by our db snaps to offer good support for a particular DBMS. “Good” isn’t necessarily complete support, though.

The Oracle Snap Pack isn’t DBMS_OUTPUT-aware, so it doesn’t offer a convenient way to capture it. Collecting the DBMS_OUTPUT text in PL/SQL and returning it in an OUT parameter would work; maybe you are thinking of other ways too. Anyway, that’s extra work for presumably working code, which is no good.
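
For completeness, here is roughly what that OUT-parameter workaround could look like; this is only a sketch, and run_with_output / my_existing_proc are hypothetical names:

  CREATE OR REPLACE PROCEDURE run_with_output (p_log OUT CLOB) IS
    v_lines DBMS_OUTPUT.CHARARR;
    v_count INTEGER;
  BEGIN
    DBMS_OUTPUT.ENABLE(NULL);        -- unlimited buffer
    my_existing_proc;                -- the unchanged legacy procedure
    LOOP
      v_count := 1000;
      DBMS_OUTPUT.GET_LINES(v_lines, v_count);
      FOR i IN 1 .. v_count LOOP
        p_log := p_log || v_lines(i) || CHR(10);
      END LOOP;
      EXIT WHEN v_count < 1000;      -- fewer lines than requested means the buffer is drained
    END LOOP;
  END run_with_output;
  /

It keeps the legacy body untouched, but it is still one wrapper per procedure, which is the extra work I mean.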

You could log a support ticket requesting better support for this. Peeking at the code, maybe we could add a property to the Stored Procedure snap (for example, “Capture DBMS_OUTPUT”) that would enable DBMS_OUTPUT (with a max buffer size?), execute the stored procedure, and then collect the lines in the DBMS_OUTPUT buffer. This could be returned in the snap output view for inspection/further processing.
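
Under the hood, that property would presumably drive the same three-step sequence other Oracle clients use today. This is a sketch only, since nothing is implemented yet, and my_existing_proc is a stand-in for whatever the snap is configured to call:

  -- 1. Before the call: turn the session's DBMS_OUTPUT buffer on
  BEGIN DBMS_OUTPUT.ENABLE(buffer_size => NULL); END;
  /
  -- 2. Execute the stored procedure exactly as the snap does today
  BEGIN my_existing_proc; END;
  /
  -- 3. After the call: drain the buffer
  DECLARE
    v_lines DBMS_OUTPUT.CHARARR;
    v_count INTEGER := 1000;
  BEGIN
    DBMS_OUTPUT.GET_LINES(v_lines, v_count);
    -- the snap would map v_lines(1 .. v_count) into its output view
  END;
  /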

I’m not sure if this would satisfy your use cases. You could elaborate if not. I’m also curious how other users are getting around this.

wpenfold
New Contributor III

Yes, it would be very helpful if the stored procedure snap could capture the DBMS_OUTPUT. In the meantime, I’ve written a procedure that writes to a file, which we can use to replace the DBMS_OUTPUT calls. But it does mean substantial changes to each Oracle program we want to run from Snaplogic.
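
A file-writing replacement like ours can be as simple as a UTL_FILE wrapper. A sketch only (the LOG_DIR directory object and all of the names are illustrative, and error handling is omitted):

  CREATE OR REPLACE PROCEDURE log_line (p_file IN VARCHAR2, p_text IN VARCHAR2) IS
    v_fh UTL_FILE.FILE_TYPE;
  BEGIN
    v_fh := UTL_FILE.FOPEN('LOG_DIR', p_file, 'a', 32767);  -- append mode
    UTL_FILE.PUT_LINE(v_fh, p_text);
    UTL_FILE.FCLOSE(v_fh);
  END log_line;
  /

  -- Existing calls then change from
  --   DBMS_OUTPUT.PUT_LINE('inserted ' || v_rows || ' rows');
  -- to
  --   log_line('load_customers.log', 'inserted ' || v_rows || ' rows');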

Also, it would be helpful, regarding the global temp table report issue, if Snaplogic didn’t run multiple PL/SQL procedure calls concurrently within the same Oracle session. Right now, if program A and program B are run concurrently from two different pipelines, they both share the same Oracle session, and therefore the same temporary table. Both write report data to the same instance of the temporary table, and whichever program ends first produces a report containing mixed output from both A and B.

mbowen
Employee

Enabling and capturing DBMS_OUTPUT looks very doable, but it is some work. If you would like this feature, it would be great if you could log a support ticket requesting it.

Thanks for clarifying the GTT issue. Actually, you were pretty clear in your first post. The Stored Procedure snap is associated with an Account, which will try to re-use its Connection object (this maps to a db session in Oracle).

When you describe the mixed output, I suspect that the Stored Procedure snaps in the different pipelines refer to the same account. If that’s true, you could experiment with cloning the account (e.g., Account2) and then referring to the clone in one of the pipelines. Since GTT data is private to a db session, concurrent pipeline runs should then write to separate, session-private copies of the GTT.

wpenfold
New Contributor III

Thanks. I’ll submit a ticket about supporting DBMS_OUTPUT.

As far as the GTT issue, we have over 100 pipelines running now, and many more to convert from our current scheduling software. I don’t think that creating and using a different account for each pipeline would be feasible.

I don’t know if Snaplogic could handle db connections any differently…? We have multi-user PHP web applications using connection pooling that successfully call PL/SQL packages using GTTs without collision, so it seems like it ought to be possible.
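
For what it’s worth, one Oracle-side detail that can make pooled reuse safe, and may be part of why those applications don’t collide, is a transaction-scoped GTT: its rows disappear at COMMIT, so a connection returned to the pool starts the next caller with an empty table, and the pool only lends a connection to one request at a time anyway. For example:

  -- the same illustrative report table, but transaction-scoped instead of session-scoped
  CREATE GLOBAL TEMPORARY TABLE report_gtt (
    line_no  NUMBER,
    line_txt VARCHAR2(4000)
  ) ON COMMIT DELETE ROWS;   -- rows are removed automatically at COMMIT/ROLLBACK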