08-06-2023 02:23 PM
I want to create a "load" pipeline that will ingest a JSON document containing the pipeline name and the parameters to pass to that pipeline. So for example I'd start with a JSON like this:
{
  "targets": [
    {
      "redshift": {
        "pipeline_name": "redshift_pipeline",
        "connection_string": "foo"
      }
    },
    {
      "s3": {
        "pipeline_name": "s3_pipeline",
        "bucket": "bar"
      }
    }
  ]
}
And for each item in that list, I'd want to pass the values in each dict to the Execute Pipeline Snap.
So my process has been:
1. Build the iterable item (JSON).
2. Somehow parse and iterate over it.
3. Use the Execute Pipeline Snap for each item.
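The three steps above can be sketched outside SnapLogic, using Python as a stand-in for the pipeline logic (the function name `iter_targets` and the `calls` list are illustrative, not part of any SnapLogic API):

```python
import json

# Hypothetical stand-in for the "load" pipeline: parse the targets
# JSON, then collect the (pipeline name, parameters) pair for each
# entry that a Pipeline Execute step would receive.
payload = """
{
  "targets": [
    {"redshift": {"pipeline_name": "redshift_pipeline", "connection_string": "foo"}},
    {"s3": {"pipeline_name": "s3_pipeline", "bucket": "bar"}}
  ]
}
"""

def iter_targets(raw):
    """Yield (target_name, params) pairs from the targets payload."""
    doc = json.loads(raw)
    for entry in doc["targets"]:
        for target_name, params in entry.items():
            yield target_name, params

calls = []
for name, params in iter_targets(payload):
    # In SnapLogic, this is where the child pipeline would be invoked
    # with params["pipeline_name"] and the remaining key/value pairs.
    pipeline_name = params["pipeline_name"]
    child_params = {k: v for k, v in params.items() if k != "pipeline_name"}
    calls.append((pipeline_name, child_params))
```

Each element of `calls` then corresponds to one invocation of the Execute Pipeline Snap.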
08-22-2023 12:11 PM
Hi @rriley99 ,
Good day! You can use a JSON Generator Snap to set up your "targets", then make sure "Process array" is enabled so that the Snap iterates over each object in the array.
To pass/receive those values in the child pipeline, the child needs a Snap with one open input view.
Parent pipeline:
Sample child pipeline:
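As a rough model of what the "Process array" setting does (Python standing in for the Snap's behavior; the variable names are illustrative): the array is split so each element flows downstream as its own document, and each of those documents is what the Execute Pipeline Snap and the child pipeline receive.

```python
# Model of JSON Generator + "Process array": the generated array is
# split into one downstream document per element, instead of emitting
# the whole array as a single document.
generated = {
    "targets": [
        {"redshift": {"pipeline_name": "redshift_pipeline", "connection_string": "foo"}},
        {"s3": {"pipeline_name": "s3_pipeline", "bucket": "bar"}},
    ]
}

# One document per array element.
documents = list(generated["targets"])
```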
08-23-2023 01:23 PM
OK, I understand. So I'm passing this JSON as a pipeline parameter, using the JSON Generator Snap, but what arrives is just {"key": "json string"}: the value is a plain string rather than structured JSON.
How does that work?
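My guess is that, since pipeline parameters come through as plain strings, the child side has to parse the string back into a structure before iterating, something like this (Python's `json.loads` standing in for the expression language's JSON.parse; the parameter value shown is just the example above):

```python
import json

# A pipeline parameter arrives as a plain string, so a value like
# '{"key": "json string"}' must be parsed before its fields can be
# addressed. SnapLogic's expression language exposes JSON.parse for
# this; json.loads is the Python equivalent.
param_value = '{"key": "json string"}'
parsed = json.loads(param_value)
```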