
Referencing an earlier Snap

rriley99
New Contributor III

Hello.

I'm trying to pass JSON as a pipeline parameter. I've prefixed my pipeline with a Mapper Snap to "parse" the JSON, so the output of the Mapper is a JSON object I can reference later in the pipeline. A sample of this JSON looks like this:

{
    "extract": {
        "source": {
            "schema": "public",
            "table": "table",
            "account": "../../shared/source-dev",
            "pipeline_name": "projects/shared/query_extract",
            "load_type": {
                "full-load": {}
            },
            "target": {
                "s3_account": ",../../shared/s3",
                "target_bucket": "dev-bucket",
                "target_path": "snaplogic_test.csv"
            }
        }
    }
}

And here is the Mapper (the output is "input_params"):

rriley99_1-1698947079487.png
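
For reference, the Mapper boils down to a single mapping that parses the pipeline parameter into a document field, along these lines (the parameter here is named input_params):

    Expression:  JSON.parse(_input_params)
    Target path: $input_params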

I've reduced my pipeline to the minimum configuration that still reproduces the problem.

rriley99_4-1698947565998.png

I have the Mapper mentioned above, a Pipeline Execute to query the source, a Router that takes different actions depending on whether there are results, and, if there are, a Pipeline Execute to write that data to S3. I'm probably still missing a few things, like parsing the documents into JSON for S3, but I'll cross that bridge when I get there.

The problem I'm running into is that in the Router and Pipeline Execute Snaps I can reference "$input_params.etc" in the expression logic, but once I'm past those Snaps I can no longer reference it.

So my questions are:

  1. How can I reference that Mapper output, "input_params", in my S3 pipeline?
  2. Is there a better way to take a single JSON parameter and make its parts available throughout the entire pipeline? 
1 ACCEPTED SOLUTION

rriley99
New Contributor III

Ah, I don't know how I missed this. I was able to do something like "JSON.parse(_input_params).foo.bar.bin", which is the same expression used in the Mapper.
I'm not sure whether this is any more efficient than a "set variable" function would be, but since that's not available, I think this will suffice.
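
Applied to the sample parameter from the original post, a downstream expression ends up looking something like:

    JSON.parse(_input_params).extract.source.schema
    JSON.parse(_input_params).extract.source.table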


3 REPLIES

alchemiz
Contributor III

Hi @rriley99,

Good day, try passing the JSON parameter as a Base64-encoded string, then decode it when you parse it.

alchemiz_0-1699144709966.png
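
In the Mapper, the parsing step would then look roughly like this (a sketch, assuming the parameter is still named input_params and using the expression language's Base64 functions):

    Expression:  JSON.parse(Base64.decode(_input_params))
    Target path: $input_params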

As a suggestion, you can use an expression library to hold all of these properties; you can then access them anywhere in the pipeline through the lib namespace, e.g. lib.inputParams.extract.target.target_bucket (a sketch of such a library follows the link below).

https://docs-snaplogic.atlassian.net/wiki/spaces/SD/pages/1438110/Expression+Libraries
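
A minimal sketch of such a library file (the field subset and the import name inputParams are just for illustration; the file is added under the pipeline's Expression Libraries property):

    {
        "extract": {
            "source": {
                "schema": "public",
                "table": "table"
            },
            "target": {
                "target_bucket": "dev-bucket",
                "target_path": "snaplogic_test.csv"
            }
        }
    }

With that in place, lib.inputParams.extract.target.target_bucket evaluates to "dev-bucket" anywhere in the pipeline.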

Base64-encoded JSON:
ew0KICAgICJleHRyYWN0Ijogew0KICAgICAgICAic291cmNlIjogew0KICAgICAgICAgICAgInNjaGVtYSI6ICJwdWJsaWMiLA0KICAgICAgICAgICAgInRhYmxlIjogInRhYmxlIiwNCiAgICAgICAgICAgICJhY2NvdW50IjogIi4uLy4uL3NoYXJlZC9zb3VyY2UtZGV2IiwNCiAgICAgICAgICAgICJwaXBlbGluZV9uYW1lIjogInByb2plY3RzL3NoYXJlZC9xdWVyeV9leHRyYWN0IiwNCiAgICAgICAgICAgICJsb2FkX3R5cGUiOiB7DQogICAgICAgICAgICAgICAgImZ1bGwtbG9hZCI6IHt9DQogICAgICAgICAgICB9DQogICAgICAgIH0sDQogICAgICAgICJ0YXJnZXQiOiB7DQogICAgICAgICAgICAiczNfYWNjb3VudCI6ICIsLi4vLi4vc2hhcmVkL3MzIiwNCiAgICAgICAgICAgICJ0YXJnZXRfYnVja2V0IjogImRldi1idWNrZXQiLA0KICAgICAgICAgICAgInRhcmdldF9wYXRoIjogInNuYXBsb2dpY190ZXN0LmNzdiINCiAgICAgICAgfQ0KICAgIH0NCn0

Thanks,
EmEm

rriley99
New Contributor III

Every pipeline (not execution) would have a different set of parameters, so I don't think using an Expression Library would be optimal (not static enough). It wouldn't be a deal breaker, but I would REALLY want these to be human-readable for debugging and monitoring purposes.
