06-13-2017 07:38 AM
I am currently using the Pipeline Execute snap to execute a child pipeline inside of my parent pipeline. The Pipeline Execute snap allows you to dynamically determine which pipeline to execute. Since the pipeline to execute is dynamic, so are the names and number of pipeline parameters that could be needed in the child pipeline. For example, child pipeline A might have 3 parameters, while child pipeline B might have only 1 parameter. Assume that the parent pipeline has a JSON object containing the key-value pairs of the parameters that need to be passed into the Pipeline Execute snap. Is there a way to pass those parameters to the child pipeline?
I assume I could call a URL-triggered task for the child pipelines and pass in the parameters through the URL, but I would prefer to keep using the Pipeline Execute snap instead.
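For what it's worth, that alternative would look something like the sketch below (TypeScript, purely illustrative; the task URL and the parameter names are placeholders, not real values from my project):

// Hypothetical illustration of passing parameters to a URL-triggered task
// through the query string. The base URL below is a placeholder.
const baseUrl = "https://example.snaplogic.com/api/1/rest/slsched/feed/MyOrg/MyProject/ChildTask";
const params = { region: "us-west", batchSize: "100" };     // example key-value pairs
const query = new URLSearchParams(params).toString();      // "region=us-west&batchSize=100"
const taskUrl = `${baseUrl}?${query}`;
// The child pipeline would then pick these up as its pipeline parameters.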
06-13-2017 09:29 AM
Unfortunately, it’s not really possible at this time, but it’s something we should try to support. You can sort of work around it by passing an object in through a single parameter with JSON.stringify() and then unpacking it in the child pipeline with JSON.parse(). However, the child pipelines would have to be designed to accept parameters that way.
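To illustrate the packing/unpacking idea, here is a rough sketch in TypeScript (the single parameter name "params" and the example values are assumptions, not anything SnapLogic-specific):

// Parent side: pack all of the child's parameters into one JSON string,
// which becomes the value of a single pipeline parameter (e.g. "params").
const childParams = { region: "us-west", batchSize: 100, dryRun: false };  // example values
const packed = JSON.stringify(childParams);

// Child side: unpack that single string parameter back into an object
// and read individual values out of it.
const unpacked = JSON.parse(packed) as Record<string, unknown>;
console.log(unpacked["region"]);     // "us-west"
console.log(unpacked["batchSize"]);  // 100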
08-11-2017 09:23 AM
Did you try this?
In the parent (master) pipeline, prepare an object that holds these key-value pairs of parameters.
In the child pipeline, once you receive that object (not through pipeline parameters, but in the incoming document), decode it and decide which pipeline to execute, along with the parameter values taken from the object.
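As a rough sketch of the shape that document could take (TypeScript, just to illustrate; the field names pipelineName and params are made up):

// Hypothetical shape of the document the parent sends to the child.
interface ChildRequest {
  pipelineName: string;             // which pipeline the child should run
  params: Record<string, unknown>;  // the key-value parameters for it
}

// Parent side: build the document.
const request: ChildRequest = {
  pipelineName: "ChildPipelineA",
  params: { region: "us-west", batchSize: 100, dryRun: false },
};

// Child side: read the incoming document and act on it.
function handleRequest(doc: ChildRequest): void {
  const { pipelineName, params } = doc;
  console.log(`Would execute ${pipelineName} with`, params);
}

handleRequest(request);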
Hope that makes sense.