Pipeline1
- List the directory of documents. Output: e.g. 200 filenames
- Pass every filename to Pipeline2 asynchronously
Pipeline2 (will be started 200 times)
- Get filename
- Read file
- and so on…
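To make the intent concrete, here is a minimal Python sketch of the fan-out pattern described above (not SnapLogic itself, just an illustration of the failure isolation I'm after): the parent spawns one independent task per filename and never inspects the child results, so a child's failure cannot bubble back up. `process_file` is a hypothetical stand-in for Pipeline2.

```python
from concurrent.futures import ThreadPoolExecutor

def process_file(filename):
    # Stand-in for Pipeline2: fails for some inputs to simulate a bad file.
    if filename.endswith(".bad"):
        raise ValueError(f"cannot process {filename}")
    return f"processed {filename}"

def pipeline1(filenames):
    # Fan out one independent task per filename. We never call .result(),
    # so a child's exception stays inside its Future and cannot propagate
    # into the parent.
    with ThreadPoolExecutor(max_workers=8) as pool:
        for name in filenames:
            pool.submit(process_file, name)
    return "pipeline1 done"  # reached even if some children failed
```

This is the behavior I want from Pipeline Execute: Pipeline1 finishes regardless of what happens inside any Pipeline2 run.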
The goal is to decouple Pipeline1 and Pipeline2.
I just want to trigger Pipeline2 with the filename from Pipeline1, fire-and-forget.
Once triggered, if Pipeline2 fails, that should not concern Pipeline1.
Is this doable in SnapLogic?
If I use the Pipeline Execute snap, then even if Pipeline2 returns an "OK" JSON message right after starting, it seems that when Pipeline2 fails, the root pipeline stops with the error message from Pipeline2.
Once Pipeline1 gets that error, the other Pipeline2 instances then fail with "The root pipeline execution is no longer running".
Shouldn't this be fully decoupled? Any ideas? Would a REST Post snap calling a Triggered Task URL work better?
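For comparison, the REST-based approach I have in mind would look roughly like this sketch (Python, `urllib` from the standard library; the task URL is hypothetical). The point is that the caller swallows every error, so nothing the child does can fail the parent:

```python
import json
import urllib.request

def fire_and_forget(task_url, filename, timeout=2):
    """POST a filename to a (hypothetical) triggered-task URL.

    Whatever happens downstream is the child's problem: all errors,
    including an unreachable endpoint, are swallowed here so the
    caller (Pipeline1) never sees them.
    """
    payload = json.dumps({"filename": filename}).encode("utf-8")
    req = urllib.request.Request(
        task_url,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        urllib.request.urlopen(req, timeout=timeout)
        return True
    except Exception:
        # Deliberately ignore failures: fire-and-forget.
        return False
```

If the REST Post snap behaves like this (i.e. Pipeline1 only cares whether the trigger request was sent, not whether Pipeline2 succeeds), that would give me the decoupling I'm looking for.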