
Pipeline Execute snap in an Ultra task: Pool Size when Reuse executions is disabled

omair
Contributor

Hello,

I have an Ultra pipeline task that executes a pipeline which uses the Pipeline Execute snap with:

  • Reuse executions unchecked, and
  • Pool Size > 1


I was expecting this pipeline to process multiple documents in parallel, but when I inspect the SnapLogic dashboard I see that documents are being processed sequentially.

Is this the expected behavior of this parameter when Reuse executions is disabled? The Pipeline Execute documentation is vague, but this passage:

"When Reuse is enabled, the Snap starts a new execution only if all executions are busy working on documents or binary data and the total number of executions is below the pool size."

suggests that Pool Size may be disregarded when Reuse is disabled. Is that what's happening?

If so, how can I configure this pipeline task to process multiple documents in parallel? Do I basically need to set the Ultra pipeline task's instance count to a value greater than 1?
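To make sure I'm reading that documented rule correctly, here is a minimal Python sketch of it as I understand it; the names and the pool size value are mine for illustration, not SnapLogic's:

```python
# Hypothetical model of the documented Reuse-enabled rule,
# not SnapLogic's actual implementation.
POOL_SIZE = 3
executions = []  # each entry: {"id": int, "busy": bool}

def dispatch(doc):
    """Route a document per the quoted rule."""
    idle = [e for e in executions if not e["busy"]]
    if idle:
        worker = idle[0]  # reuse an idle execution
    elif len(executions) < POOL_SIZE:
        # All executions are busy and the total is below the pool size:
        # start a new execution.
        worker = {"id": len(executions), "busy": False}
        executions.append(worker)
    else:
        return None  # pool exhausted: the document has to wait
    worker["busy"] = True
    print(f"document {doc!r} -> execution {worker['id']}")
    return worker
```

If that reading is right, the pool only grows when Reuse is enabled, which would explain the sequential behavior I'm seeing.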

3 REPLIES

igormicev
Contributor

Hi @omair,

What if you group the documents (Group By N, e.g. with size = 10) just before the Pipeline Execute snap, and then at the beginning of the called pipeline receive the grouped documents with a JSON Splitter?
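Roughly, the data shapes would look like this; a quick Python sketch with made-up documents (the field names are illustrative, not the snaps' actual schema):

```python
# Simulate Group By N (size = 10) followed by a JSON Splitter in the child.
docs = [{"id": i} for i in range(25)]  # incoming document stream

# Group By N batches the stream into lists of up to 10 documents,
# so Pipeline Execute sends 3 grouped documents instead of 25.
groups = [docs[i:i + 10] for i in range(0, len(docs), 10)]
print([len(g) for g in groups])  # [10, 10, 5]

# In the called pipeline, the JSON Splitter re-emits the originals
# so downstream snaps still see one document at a time.
for group in groups:
    for doc in group:
        pass  # per-document processing happens here
```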

BR, Igor

SpiroTaleski
Valued Contributor

Hi @omair,

I found the following post on the SnapLogic Community: Pipeline Execute parallel + sequential execution - #8 by krupalibshah

I think it contains further clarification regarding parallel processing of documents.

Regards,
Spiro Taleski

koryknick
Employee

I think this may be more about the fact that Ultra pipelines really process only one document at a time. So to answer the concurrency question: yes, you will need to define your Ultra task with "Instances n per Snaplex" set to at least as many instances as there are nodes in the Snaplex, and with "Max In-Flight" set to a number greater than 1, to allow for the concurrency you're looking for.
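As a rough back-of-the-envelope check, assuming (as I understand it; please verify against your Snaplex) that each instance can hold up to Max In-Flight documents at once:

```python
# Hypothetical upper bound on concurrently processed documents for an
# Ultra task; real throughput also depends on node capacity and snap behavior.
def max_concurrent_docs(instances_per_snaplex: int, max_in_flight: int) -> int:
    return instances_per_snaplex * max_in_flight

# e.g. 4 instances with Max In-Flight of 5 -> at most 20 documents at a time
print(max_concurrent_docs(instances_per_snaplex=4, max_in_flight=5))  # 20
```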