Forum Discussion
What is the issue with multiple open outputs? They are not a functional problem for the pipeline: without a connected snap, any documents sent to an open output view are simply discarded.
If it really is a problem, and your result sets will never exceed 2147483647 documents, you could attach an Exit snap and set the exit value to 2147483647.
Or
Use a Filter snap connected to a JSON Formatter connected to a File Writer. Set the expression in the Filter snap to false so that no documents ever pass through. You will have to assign a file name in the File Writer, and it will write an empty JSON file, but you can put it in SLDB without worrying about size constraints.
Best regards,
Rich
- IgnatiusN (8 years ago, New Contributor II)
This pipeline is a sub-pipeline of an integration pipeline, and having multiple outputs causes problems.
- del (8 years ago, Contributor III)
@IgnatiusN, I used to use a Script snap (with output view removed) for this.
Python Example:
from com.snaplogic.scripting.language import ScriptHook
from com.snaplogic.scripting.language.ScriptHook import *

class MFParser(ScriptHook):
    def __init__(self, input, output, error, log):
        self.input = input
        self.output = output
        self.error = error
        self.log = log

    def execute(self):
        # Drain the input view; documents are read and discarded.
        while self.input.hasNext():
            data = self.input.next()

hook = MFParser(input, output, error, log)
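Outside the SnapLogic Jython environment, the same "consume everything, emit nothing" sink pattern can be sketched in plain Python. This is only an illustrative analogue (the `NullSink` class name and the list-of-dicts input are made up for the example), not the actual Script-snap API:

```python
class NullSink:
    """Drains an iterator of documents without forwarding any of them,
    mirroring the Script snap above with its output view removed."""

    def __init__(self, documents):
        self.documents = iter(documents)
        self.consumed = 0

    def execute(self):
        # Read every document and discard it, counting as we go.
        for _ in self.documents:
            self.consumed += 1


sink = NullSink([{"id": 1}, {"id": 2}, {"id": 3}])
sink.execute()
# sink.consumed is now 3; nothing was written anywhere
```

The point of the pattern is simply that the stream is fully consumed, so the upstream pipeline completes normally while the unwanted output is swallowed.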
I have since written a custom snap that does the same thing with slightly better performance (a difference measured in milliseconds).