Forum Discussion

rriley99
New Contributor III
2 years ago

For each entry run a pipeline

I want to create a "load" pipeline that will ingest a JSON file containing the pipeline names and the parameters to pass to each pipeline. For example, I'd start with JSON like this:

{
  "targets": [
    {
      "redshift": {
        "pipeline_name": "redshift_pipeline",
        "connection_string": "foo"
      }
    },
    {
      "s3": {
        "pipeline_name": "s3_pipeline",
        "bucket": "bar"
      }
    }
  ]
}

And for each item in that list, I want to pass the values in each dict to the "Pipeline Execute" Snap.

So my process has been

1. Build the iterative item (JSON)

2. Somehow parse and iterate over it

3. Use the Pipeline Execute snap.
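In plain Python terms, the parse-and-iterate logic I'm after would look roughly like this. (`execute_pipeline` here is just a stand-in for the Pipeline Execute snap, not a real API; in SnapLogic the dispatch is configured in the designer, not written in code.)

```python
import json

# Stand-in for the "Pipeline Execute" Snap: records which child
# pipeline would be run and with which parameters.
def execute_pipeline(pipeline_name, params):
    return (pipeline_name, params)

payload = json.loads("""
{
  "targets": [
    {"redshift": {"pipeline_name": "redshift_pipeline", "connection_string": "foo"}},
    {"s3": {"pipeline_name": "s3_pipeline", "bucket": "bar"}}
  ]
}
""")

results = []
for target in payload["targets"]:
    # Each list item is a one-key dict ("redshift", "s3", ...);
    # the value holds the pipeline name plus its parameters.
    for _, config in target.items():
        name = config.pop("pipeline_name")
        results.append(execute_pipeline(name, config))

print(results)
```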

2 Replies

  • alchemiz
    Contributor III

    Hi rriley99 ,

    Good day! You can use a JSON Generator to set up your "targets", then make sure "Process array" is enabled so that it iterates over each object.

    In order to pass/receive the "values" in the child pipeline, you need a snap with one open input view.

    Parent pipeline:

    Sample child pipeline:
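    Sketched in plain Python, the parent/child relationship above works like this: with "Process array" on, each element of the targets array becomes its own document, and each document is handed to one execution of the child pipeline through its open input view. (`child_pipeline` and its body are illustrative only, not SnapLogic code.)

```python
def child_pipeline(document):
    # The child pipeline's first snap has one open input view, so it
    # receives the whole document sent by the parent's Pipeline
    # Execute snap.
    return f"ran {document['pipeline_name']}"

# With "Process array" enabled, the JSON Generator emits each array
# element as a separate document instead of one document holding a list.
targets = [
    {"pipeline_name": "redshift_pipeline", "connection_string": "foo"},
    {"pipeline_name": "s3_pipeline", "bucket": "bar"},
]

# Parent side: one child-pipeline execution per document.
runs = [child_pipeline(t) for t in targets]
print(runs)
```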


    • rriley99
      New Contributor III

      OK, I understand. So I'm passing this JSON as a parameter using the JSON Generator Snap, but it comes through as just a {"key": "json string"}.

      How does that work?