12-23-2019 12:06 PM
Hi,
I’m new to SnapLogic, so apologies in advance if this is a question with an obvious answer.
We have a data lake that generates SNS notifications when different files/object types are uploaded into it, and would like to create pipelines that run in response to sets of uploads/notifications.
So, we’d like to create a parent pipeline that uses the SQS Consumer snap attached to an SQS queue that subscribes to the data lake’s SNS notifications. And, when message types “A”, “B”, and “C” are all received, it uses a Pipeline Execute snap to kick off child pipeline “X” one time. When message types “D”, “E”, and “F” are all received, it uses a Pipeline Execute snap to kick off child pipeline “Y” one time, and so on.
What is the best practice way to queue up / keep track of these sets of messages across pipeline executions?
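To make the behavior we’re after more concrete, here’s a rough sketch in plain Python (nothing SnapLogic-specific) of the “wait until a complete set of message types has arrived, then trigger the child once” idea. The type names and the trigger function are just placeholders:

```python
# Conceptual sketch only: trigger child pipeline X once after message types
# A, B, and C have all been seen, and Y once D, E, and F have all been seen.

REQUIRED_SETS = {
    "X": {"A", "B", "C"},
    "Y": {"D", "E", "F"},
}

# Track which required types have arrived so far for each child pipeline
seen = {child: set() for child in REQUIRED_SETS}

def on_message(message_type: str) -> None:
    """Record an incoming notification and fire a child pipeline once its set is complete."""
    for child, required in REQUIRED_SETS.items():
        if message_type in required:
            seen[child].add(message_type)
            if seen[child] == required:
                trigger_child_pipeline(child)  # placeholder for the actual kick-off
                seen[child].clear()            # reset so the next complete set triggers again

def trigger_child_pipeline(name: str) -> None:
    print(f"Kick off child pipeline {name} one time")
```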
Thank you
12-23-2019 09:41 PM
Hi,
After your SQS Consumer Snap, use a Router Snap and route the data based on message type: message types A, B, and C go to one route, and D, E, and F go to the other. At the output of the Router, use a Pipeline Execute Snap on each route; route 1 calls pipeline X and route 2 calls pipeline Y. Perform your logic in the child pipeline and then acknowledge the message.

In the Pipeline Execute Snap, enable the Reuse executions to process documents option. When this is enabled, the Snap starts a child execution and passes multiple input documents to that execution. Reusable executions continue to live until all of the input documents to this Snap have been fully processed. If this flag is not enabled, a new pipeline execution is created for each input document.
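Outside of SnapLogic, the logic of that flow is roughly the following (a conceptual Python/boto3 sketch only, not pipeline configuration; the queue URL and the "messageType" field name are assumptions about your notification payload):

```python
# Rough analogue of SQS Consumer -> Router -> Pipeline Execute.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/data-lake-notifications"  # placeholder

# Router Snap equivalent: which child pipeline handles which message types
ROUTES = {
    frozenset({"A", "B", "C"}): "pipeline_X",
    frozenset({"D", "E", "F"}): "pipeline_Y",
}

def route(message_type: str):
    """Pick the downstream pipeline based on the message type, as the Router Snap would."""
    for types, pipeline in ROUTES.items():
        if message_type in types:
            return pipeline
    return None

def consume_once() -> None:
    """Read a batch of messages, route each one, then acknowledge (delete) it."""
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=10)
    for msg in resp.get("Messages", []):
        body = json.loads(msg["Body"])
        target = route(body.get("messageType", ""))
        if target:
            # In the pipeline, this document would go to the Pipeline Execute Snap
            # with Reuse executions enabled, so one child execution handles many documents.
            print(f"Would pass this document to {target}")
        # Acknowledge only after the child logic has finished, as in the pipeline
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```

In the actual pipeline, the Router and Pipeline Execute Snaps do all of this for you; the sketch is only to show where each Snap fits in the flow.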
Regards
Anubhav
12-24-2019 05:51 AM
Thank you Anubhav. This sounds like what we’ve been looking for.
Joe