I have a single pipeline (no child pipelines) that processes a file from an AWS S3 bucket, and at the end of the pipeline I want to move that file to a different S3 bucket. The pipeline gets the S3 key from an SQS queue at the beginning, and the key is unique/dynamic every time the pipeline is executed. What is the best way to store that dynamic S3 key value at the beginning of the pipeline execution and retrieve it at the end? I can’t store it as a pipeline parameter, since parameters can’t change during pipeline execution. I know I could write the value to a file and read it back later, but that’s obviously not an optimal solution. Is there a way in SnapLogic to save small data variables in memory and retrieve them later?
Welcome @ausman - you are correct that you cannot “save” a value back to pipeline parameters, but there are some easy ways to do what you are looking for.
One simple method is to have a main pipeline that reads from the SQS queue and then calls a child pipeline to do the work on that file. The main pipeline can then use the $original object value returned by the Pipeline Execute snap to get the document that was originally passed into the child pipeline.
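As a rough analogy (plain Python, not SnapLogic code, with hypothetical function and field names), the parent/child pattern works because the parent still holds the original input document after the child finishes, which is what the $original field on the Pipeline Execute output gives you:

```python
# Conceptual sketch of the parent/child pattern. The parent keeps the
# original document while the child does the work, mirroring how the
# Pipeline Execute snap returns $original alongside the child's output.

def child_pipeline(doc):
    """Hypothetical child: processes the S3 object named by doc['s3_key']."""
    return {"status": "processed"}

def parent_pipeline(sqs_message):
    doc = {"s3_key": sqs_message["key"]}   # read the dynamic key from SQS
    result = child_pipeline(doc)           # Pipeline Execute call
    # Like $original, the parent still has the untouched input document,
    # so the dynamic key is available for the final move-to-bucket step:
    return {"original": doc, "result": result, "move_key": doc["s3_key"]}

out = parent_pipeline({"key": "incoming/file-123.csv"})
```

Here `out["move_key"]` still holds the key the pipeline started with, even though the child never passed it back.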
Another way (which in my opinion is a little more cumbersome) is to use a Copy snap after the SQS read, then a Join snap with the Merge join type after you have completed the work on the file, so you can re-use the original value that came from the SQS queue.
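Again as a conceptual analogy only (plain Python, not SnapLogic code, with hypothetical names): Copy duplicates the document stream, one branch does the work, and Join with the Merge type combines the branches back into one document so the original SQS value reappears at the end:

```python
# Conceptual sketch of the Copy + Join (Merge) pattern.

def copy_snap(doc):
    # Copy snap: fan the document out into two identical branches.
    return dict(doc), dict(doc)

def process(doc):
    # Hypothetical work branch: operate on the file named by doc["s3_key"].
    doc["result"] = "done"
    return doc

def join_merge(original, processed):
    # Join snap, Merge type: combine fields from both branches into one doc.
    merged = dict(processed)
    merged.update(original)
    return merged

original, working = copy_snap({"s3_key": "incoming/file-123.csv"})
merged = join_merge(original, process(working))
# merged now carries both the processing result and the original S3 key.
```

The untouched branch acts as the “memory” that carries the dynamic key past the processing steps.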
I’m sure some other clever folks could give you other ways to solve the same issue, but these are pretty simple to understand and implement.
Hope this helps!