Forum Discussion
It is possible that the JSON Formatter would contribute, especially if it follows a Gate snap, which consumes the entire input stream into memory. A JSON Splitter may do the same thing, since it reads one document at a time in order to split it; if that document is particularly large, the snap needs enough memory to load the whole document.
The number of snaps alone doesn't necessarily drive up in-use memory, unless you are passing one very large document through the pipeline and each snap has to allocate and de-allocate memory for it.
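To make the distinction concrete in general terms (this is an illustrative Python sketch, not SnapLogic code): a Gate-style step must materialize the whole input stream before it emits anything, while a splitter-style step can hold only one document at a time. The data and function names here are hypothetical.

```python
import io
import json

# A small newline-delimited JSON stream standing in for a pipeline's input.
ndjson = "\n".join(json.dumps({"id": i}) for i in range(3))

# Gate-style: buffer the ENTIRE stream in memory before doing anything.
# Peak memory grows with the total size of the input.
buffered_docs = [json.loads(line) for line in ndjson.splitlines()]

def stream_docs(stream):
    """Splitter-style: yield one parsed document at a time.

    Peak memory grows only with the size of the largest single
    document, not the whole stream."""
    for line in stream:
        if line.strip():
            yield json.loads(line)

streamed_ids = [doc["id"] for doc in stream_docs(io.StringIO(ndjson))]
```

The same output comes out of both paths; the difference is only in how much of the input is resident in memory at once, which is why a very large single document is the problem case for the splitter-style approach too.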
My recommended remediation would be a pipeline redesign that limits the number and type of memory-intensive snaps. As with most things, there are a number of ways to accomplish the same goal; the art of this science is finding the best solution for the given environment.
There are also enhancements to many of the snaps I listed that can help limit the amount of memory being used.
- winosky · 5 years ago · New Contributor III
Ok thanks for the insight Kory, appreciate the help.