Passing parameters/session variables between nested pipelines/snaps

Early warning - I’m a newbie to the platform…

I want to wrap up various bits of logic/lookups/data retrieval in reusable pipelines that mask the underlying complexity (e.g., finding and reading a particular data source without knowing the storage protocol, access keys, etc.), but I can't get to grips with passing metadata in and out between pipelines and snaps. Should I be passing a [control] document stream in (with one document, e.g., a JSON containing key/value metadata pairs) so that I can pass it onwards to the next step (alongside another document stream carrying the actual data documents), rather than passing parameters (which I can't see a way of updating or passing back out)?

If it should be via parameters, I'd need to use the Script snap to read/write them, but I can't find any examples/references for reading pipeline parameters or writing/updating parent parameters. Are only the input/output/error/log objects available via the Script snap?

If it's via a control/metadata document, is there a way to define the document schema so I can use Mapper/Filter etc., or will I just need to hand-write some Python/JavaScript to read/parse/do-work/create-updated-metadata-doc/self.output.write(newmetadata)?

Thanks,
Mike

If you have sample pipelines, seeing those would be helpful, if possible.

While the exact answer is going to depend on the application, I'd generally say to use documents if you are going to be using Pipeline Execute (https://doc.snaplogic.com/wiki/display/SD/Pipeline+Execute), since with reuse turned on (i.e., you spin up the child pipeline once, instead of spinning it up and tearing it down per document) the pipeline parameters can only be set once.
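A rough way to picture that reuse behavior (a Python sketch, not SnapLogic code; the class and field names are hypothetical): the child's parameters are frozen when it spins up, while documents keep streaming through it.

```python
# Hypothetical model of Pipeline Execute with reuse enabled:
# parameters are bound once at startup; documents flow through per call.
class ChildPipeline:
    def __init__(self, params):
        # Parameters are fixed for the lifetime of this instance.
        self.params = dict(params)

    def process(self, doc):
        # Documents can carry different values on every invocation.
        return {**doc, "source": self.params["source"]}

# Spin the child up once (reuse on), then feed it many documents.
child = ChildPipeline({"source": "s3://bucket/data"})
out = [child.process(d) for d in [{"id": 1}, {"id": 2}]]
```

So anything that varies per document has to travel in the documents themselves, not in the parameters.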

You should be able to access pipeline parameters in the Mapper by using an underscore instead of a dollar sign - i.e., $foo if it's a variable, _foo if it's a parameter. If you need the parameters returned, you can add a Mapper at the end, check "Pass Through", and then add in the parameters you want to return using this syntax.
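To make the two namespaces concrete, here's a toy Python model (not SnapLogic internals; the function and names are made up) of how a Mapper target could resolve $-references from the document and _-references from the parameters, with "Pass Through" merging the result over the input:

```python
# Toy model of the Mapper's two namespaces:
# "$name" reads from the current document, "_name" from pipeline parameters.
def map_document(doc, params, targets):
    def resolve(ref):
        if ref.startswith("$"):
            return doc[ref[1:]]
        if ref.startswith("_"):
            return params[ref[1:]]
        return ref  # anything else is treated as a literal

    # With "Pass Through" checked, the input document is kept
    # and the mapped targets are merged on top of it.
    out = dict(doc)
    for target, ref in targets.items():
        out[target] = resolve(ref)
    return out

doc = {"foo": 1}
params = {"foo": "param-value"}
result = map_document(doc, params, {"docFoo": "$foo", "paramFoo": "_foo"})
```

This is why "returning" a parameter works: you copy its value into the outgoing document, where downstream snaps can read it as an ordinary field.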

Is that along the lines of what you’re looking for?

Shayne

Thanks a lot, Shayne. I will go with parameters and try using the underscore syntax to pass them out and along the workflow. Will post again later with an update on how I got on :slight_smile:

Hmm, the snaps vs. scripts table could be a bit more prominent in the help/docs… i.e., the Script snap:

Does not support pipeline _parameters
Supports only Document input/output
Does not support Accounts
Often awkward to debug
Cannot be unit tested

I agree the Script snap is awkward to debug… :). And I am surprised there's no feature to pass parameters in/out. Some things can't be achieved with the standard Transform snaps, but I have to fake a document as a ScriptHook input to pass in a parameter, and then try to work out the syntax with the Mapper to take the subsequent output and write it to a pipeline parameter to be used later in the flow by other snaps. I thought it would be simple/quick to do some basic JS/Python in a Script snap, but it looks like I may have to author a custom Snap with the SDK. Will persevere a few more hours with the Script snap…

"Does not support pipeline _parameters" - on the contrary, the Script snap does support pipeline parameters.

They can be accessed in a Script snap as $_pipeline_param_name

Example:

Python

        data["pipelineparam"] = $_pipeline_param

JavaScript

         new_data.pipelineparam = $_pipeline_param;

It's also documented here: https://doc.snaplogic.com/wiki/display/SD/Parameters+and+Fields


I’ll have development look at the Snaps vs Scripts table in the developer docs.

I can successfully pass parameters in/out/through using the Pipeline Execute snap - I couldn't get the child pipeline approach to work ($content, _param, $var, $_param never came out the other end), so I will just use Pipeline Execute with flow-control items/choices. I'm feeling pretty confident now, so thanks :slight_smile: And I will be using the suggested Python syntax when in Script snaps.
Mike

Sorry to come back on this thread but…

I can't update/set a pipeline parameter via the Mapper - when I type "_mypipelineparam" as the Target Path in a Mapper, it gets corrected to "$_mypipelineparam". I finally worked out the param wasn't being set by dumping it out via a Mapper (copying the param to $content) and writing that to a File Writer.

When I use a variable, e.g. "$myvariable", in the Mapper, the value is set fine. Confirmed with the same dump-to-file flow.

But I need to use this dynamically calculated value (variable or parameter) as the URI/location for a File Reader snap, which does accept a parameter (which I cannot set in the Mapper) but doesn't accept a variable (which I can set).

Is there a way to pass a dynamic value to File Reader? This is quite frustrating, and it seems inconsistent that you can use $xx with some snaps and not others, and that you can't set _parameters in the Mapper. Again, apologies for any newbie mistakes…

_Parameters cannot be set using a Mapper or by any other means; they are pipeline parameters only. Think of them as global values: you set them at the pipeline level, and you can use them across pipelines, but they can only be set at that global/pipeline level. The Mapper (or any other snap that lets you set values via the $varName notation) sets what are effectively local variables, whose values are available to the immediately following snap.

when I type "_mypipelineparam" as the Target Path in a Mapper, it gets corrected to "$_mypipelineparam"

And this is the expected behavior. Now let's say you are getting some value from an external call. For example, you invoke a triggered task and pass a "fileName" param to that task, which is then used in the pipeline as fileName + timeStamp + .extn in your File Writer snap. In that case you can use something like this in your File Writer snap settings:

'/some/project/path' + _fileName + $timeStamp + '.json'

where $timeStamp is set via a Mapper.

You could also have created a $fileToWriteTo variable in a Mapper with this expression:

'/some/project/path' + _fileName + '_' + Date.now().toLocaleDateTimeString() + '.json'

Hope it makes sense!

Sorry, my point was that the File Reader doesn't accept a $variable, so I am at a dead end: I can't write a value to the pipeline _parameter (which it does accept) either.

The File Reader does accept $ and _ variables - did you toggle the = button?


@Bhavin, I think what Mike is saying is that the File Reader has a binary input, which requires a binary formatter of some sort that does not pass through the document variables that would otherwise be available to the File Reader if it had a JSON input.

@mikeandrews, I've run into the same issue, and the only way I've been able to get this to work in the past is with two pipelines: the first ending with a Pipeline Execute or For Each snap, which assigns the filename variable's value to a parameter of the second pipeline, which contains the File Reader snap.
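A minimal Python sketch of that two-pipeline workaround (the function names and paths are hypothetical, not SnapLogic APIs): the parent computes the filename per document and invokes the child with it bound as a parameter, so the child's File Reader sees a fixed parameter value for that run.

```python
# Stand-in for the second pipeline, whose File Reader would use _fileName.
def child_pipeline(params):
    return f"read:{params['fileName']}"

# Stand-in for the first pipeline: it computes a filename per document and
# "executes" the child, binding the value as that run's pipeline parameter.
def parent_pipeline(docs):
    results = []
    for doc in docs:
        file_name = f"/data/{doc['id']}.json"   # dynamically computed value
        results.append(child_pipeline({"fileName": file_name}))
    return results

runs = parent_pipeline([{"id": 1}, {"id": 2}])
```

Each child invocation gets its own parameter binding, which is what makes a "static" parameter usable for dynamic values.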

For the same or a similar use case, a couple of years ago, I had suggested to a SnapLogic representative the need to make pipeline parameters variable rather than static. I even suggested that a new variable array could be added as a property of the pipe object. However, this was an in-person, casual conversation and may not have made it to the request queue.

[Added: I have made corrections to this post in my following response]

Binary documents have a "header" with properties that can be referenced using dollar variables. When looking at the preview data for a binary view in Designer, you should be able to see what properties are available, if any. Unfortunately, most Formatter snaps don't have a way to fill in the header document with properties, which limits the possibilities. It's a gap that needs to be addressed.

They are effectively variable when used through PipeExec, as you described, and that is the correct approach at this time. The reason parameters cannot be set is that all of the snaps run in parallel, so it's impossible to say what the value of a parameter would be at any given point if it could be changed. For example, if you had a Mapper setting a pipeline parameter, followed by a JSON Formatter and a File Writer, the Mapper could have processed anywhere from one to hundreds of documents before the Writer read the value of the pipeline parameter. Now imagine the value computed for the parameter changed for every document processed; you would never get consistent results.
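That race can be illustrated deterministically with a small Python sketch (a toy model, not SnapLogic code): the upstream stage mutates a shared "parameter" per document, but because stages buffer and run ahead of each other, the downstream stage only ever observes whatever the latest update happened to be.

```python
# Sketch of why a mutable pipeline "parameter" would be unreliable:
# snaps are streaming stages running concurrently, so a downstream snap
# cannot know which document's update it will observe.
shared_param = {"value": None}

def mapper(docs):
    for doc in docs:
        shared_param["value"] = doc["id"]   # per-document "update"
        yield doc

def writer(docs):
    # The writer may read the parameter after the mapper has raced ahead
    # by any number of documents (here: all of them, because list() drains
    # the upstream stage before any read happens).
    buffered = list(mapper(docs))
    return [(doc["id"], shared_param["value"]) for doc in buffered]

pairs = writer([{"id": 1}, {"id": 2}, {"id": 3}])
# Every document observes the last update, not "its own" value.
```

With real thread-parallel stages the interleaving would vary run to run, which is exactly the inconsistency described above.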

I should apologize and make some corrections after re-reading the above… My last comment probably didn't make much sense, because my use case was around the File Writer instead of the File Reader, so it didn't fit this post. Nevertheless, now that I've accidentally hijacked the thread and the conversation has continued… :slight_smile:

@mikeandrews, I agree @Bhavin’s previous post should resolve your issue. What has not been included (if it’s not obvious) is that you can configure the File Reader snap with an input view so that the variables can be passed through.

Now, back to my unintentional hijack… @tstack,

Thank you very much for that explanation; that makes complete sense!

I’ve personally not run into any other use case other than the above mentioned, so if the Formatter snap can be extended in some way to pass along variable header info, then the issue is solved (and I can reduce a small number of secondary pipelines).

Thanks
- Del

@Bhavin, thanks again - I hadn't toggled the "=" button. $variable now works a treat in the File Reader :slight_smile:

Thank you for the information on the syntax for pipeline parameters in scripts! I couldn't find this in the documentation.