
Conditional Snap Run

salishrodinger
New Contributor II

Hi everyone,

I have been experiencing an issue lately. I have a large zip file containing 17 CSV files. I unzip the file and write the contents to an external stage. After that, I run several procedures on Snowflake using the Snowflake Multi Execute Snap.

My concern is that SnapLogic tends to unzip File A, write File A to the external stage, run the Snowflake Multi Execute Snap, then move on to unzip File B, and so on. What I want is for the Snowflake Multi Execute Snap to run only after the File Writer Snap has successfully finished writing all 17 files to the stage.

I have tried a workaround where I use a Router Snap to count the output documents from the File Writer Snap. If the count is less than 17, I wait; if it is equal to or greater than 17, I run the following snaps. This workaround works, but it is not an optimal solution for me: if tomorrow the zip file contains 18, 19, or 20 files, the pipeline will automatically fail.

Do you have a better way to design this pipeline?

Thank you.

[Screenshot of the pipeline attached]

1 ACCEPTED SOLUTION

ptaylor
Employee

A few observations and suggestions:

  • Without the Aggregate or another snap like it in the pipeline, the Multi Execute will execute for every input document. Adding the Aggregate snap was probably sufficient to make it execute only once, since the Aggregate snap has only one output document. There are other snaps you could use for this, such as Tail or Gate.
  • If you're using S3 for staging, consider using the S3 Upload snap. It's much faster than the File Writer snap for writing multiple files.
  • Try using Snowflake Bulk Load to load the staged files instead of Snowflake Multi Execute. It should be much simpler.
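Outside SnapLogic, the fan-in idea behind the first bullet — collapse the per-file document stream into a single trigger before the load step runs — can be sketched in plain Python. This is an illustrative analogy only, not SnapLogic code; the file names and function names below are hypothetical:

```python
# Sketch of the fan-in pattern that Aggregate / Tail / Gate snaps provide:
# stage every file first, then fire the load exactly once, regardless of
# how many files the zip happens to contain.

def stage_files(names):
    """Stand-in for the File Writer / S3 Upload step: one output per file."""
    staged = []
    for name in names:
        # ... actual write to the external stage would happen here ...
        staged.append(name)
    return staged  # the full list is only available once ALL writes finish


def run_load_once(staged):
    """Stand-in for Multi Execute / Bulk Load: runs a single time, only
    after stage_files has returned, i.e. after every file is staged."""
    return f"loaded {len(staged)} files"


# Works unchanged for 17, 18, or 20 files -- no hard-coded count to route on.
files = [f"file_{i}.csv" for i in range(17)]
result = run_load_once(stage_files(files))
```

The key design point mirrors the Aggregate suggestion: the downstream step consumes one aggregated input rather than one input per file, so the file count never needs to be hard-coded.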


3 REPLIES

marjan_karafilo
Contributor

Hi @salishrodinger,

You can create a child pipeline in order to write the files into the external stage.

After all files are processed successfully, you can continue with the process.

 

Thank you @marjan_karafilo for your reply. I did try back then to create a child pipeline right after the File Writer so it could process all my Snowflake executions. I parameterized the child pipeline as a batch of 30, but somehow I got some errors too.
