Often when designing and debugging a Pipeline, the Validate function cannot be used because the Pipeline is only designed to be executed as a sub-Pipeline or as a Triggered Task. If we execute the parent Pipeline or trigger the task, the full document data at each step in the Pipeline cannot be browsed; we have to rely on the logs in the Dashboard.
Our workaround is to insert a Copy > JSON Formatter > File Writer series of Snaps at various points in the Pipeline. The drawback is that this needs to be repeated at every relevant point in the Pipeline (almost after every Snap!), is messy to create and remove, and leaves a whole heap of files in SLDB that then need to be searched through and cleaned up.
We’ve requested a feature to execute a Pipeline (triggered, scheduled or via Pipeline Execute) with a ‘capture data’ setting, which would allow the full input/output document data of each Snap to be browsed in the Dashboard (Service Request #18185).
I’d be keen to hear any thoughts or feedback from the community - is this a feature you would like to see too?