
Validate Pipeline Execute

Henchway
Contributor

Hi everyone,

I’ve been wondering whether, when validating pipelines, there is any way to jump into child pipelines that were run with a Pipeline Execute, so I can review the individual Snaps.
So far I’ve been taking the output of the initial pipeline, staging it in the executed pipeline, and then running it there. However, this is more than cumbersome and seems like a lot of wasted potential.

Possibly I’ve just missed some option; this seems to me like a must-have feature.

Best regards
Thomas

1 ACCEPTED SOLUTION

ForbinCSD
Contributor

When you say “which were run with a Pipeline Execute”, may I assume you mean you want to validate a child pipeline that was fed from a “parent” pipeline?

If so, then the “Record Replay” snap is a great solution to this.

Your child pipeline should have a single open input somewhere in the pipeline where it receives the document stream from the parent. Put the Record Replay snap just before it and save the pipeline. Then go back and execute or validate the parent pipeline so that it feeds the child with a document stream. Even if the child fails, as long as the parent ran and sent it a document stream, you’re good: the Record Replay snap will have saved the incoming documents to a file with a unique name in the project.
[screenshot]
Now you can open the child pipeline in the SnapLogic Designer canvas and validate it. The Record Replay snap will read its input from that file and feed it to the next snap, so you can see what’s happening.

Note: If, however, you have a manually created or faked document stream of your own, don’t try to put it in a file and have the Record Replay snap read from it. That won’t work, because the snap has its own funky syntax for writing the JSON. Instead of the replay snap, use a JSON Generator snap and put your “faked” or manually created documents there.
[screenshot]
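If it helps to picture it, the JSON Generator for this kind of manual staging might contain something like the following. These are purely hypothetical sample documents; the field names are made up, and you’d use whatever fields your parent pipeline would normally send:

[
  {
    "orderId": 1001,
    "customer": "Acme Corp",
    "status": "open"
  },
  {
    "orderId": 1002,
    "customer": "Globex",
    "status": "shipped"
  }
]

With a root-level array like this, the JSON Generator should emit each element as a separate document, so the rest of the child pipeline sees the same kind of document stream it would normally receive from the parent.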

Does this help?


4 REPLIES

Thank you very much, that’s exactly what I’ve been looking for.
So far I’ve been staging the data with a JSON Generator or File Reader in the child pipeline, as you suggested in your second screenshot.

The Record Replay snap should help immensely with that.

Best regards
Thomas

Supratim
Contributor III

@Henchway Are you looking for the output/response that comes from the child pipeline? If yes, make sure all the snaps in your child pipeline are set to “Validate & Execute”, and that the child pipeline has one open output view; then you will get the response from the child pipeline through the Pipeline Execute snap.

Not just the output of the child pipeline; my intention is to verify the child pipeline as well and inspect the individual Snaps in there.
ForbinCSD’s reply has pretty much answered it.

Best regards
Thomas