Conditional Snap Run
Hi everyone,
I have been running into an issue lately with a pipeline that processes a large zip file containing 17 CSV files. I unzip the file and write the extracted files into an external stage. After that, I run several procedures on Snowflake using Snowflake Multi Execute.
My concern is that SnapLogic tends to unzip File A, write File A into the external stage, and then run the Snowflake Multi Execute Snap, before moving on to unzip File B, and so on. What I want is for the Snowflake Multi Execute Snap to run only after the File Writer Snap has successfully finished writing all 17 files into the stage.
I have tried a workaround where I use a Router Snap to count the output documents from the File Writer Snap. If the count is less than 17, I wait; if it is equal to or greater than 17, I run the following snaps. This workaround works, but it is not optimal, especially if tomorrow the zip file contains 18, 19, or 20 files (the hard-coded count would make the pipeline fail).
Do you have a better way to design this pipeline?
Thank you.
A few observations and suggestions:
- Without an Aggregate or a similar snap in the pipeline, the Multi Execute snap will execute once for every input document. Adding an Aggregate snap is usually sufficient to make it execute only once, since the Aggregate snap emits a single output document. Other snaps can serve the same purpose, such as Tail or Gate.
- If you're using S3 for staging, consider using the S3 Upload snap. It's much faster than the File Writer snap for writing multiple files.
- Try using Snowflake Bulk Load to load the staged files instead of Snowflake Multi Execute. It should be much simpler.
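For reference, bulk-loading from a stage in Snowflake comes down to a single COPY INTO statement, which loads every matching file in the stage path at once, so the exact file count (17, 18, or 20) no longer matters. A minimal SQL sketch, assuming hypothetical names `my_stage` for the external stage and `my_table` for the target table:

```sql
-- Load all CSV files currently in the stage path in one statement.
-- Stage, path, and table names below are placeholders for illustration.
COPY INTO my_table
  FROM @my_stage/unzipped/
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
  PATTERN = '.*\.csv';
```

Because the statement matches files by pattern rather than by count, a zip containing more files tomorrow loads without any change to the pipeline.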