Forum Discussion

PSAmmirata
Employee
8 years ago

Snap peak memory usage and estimating required memory for a Snaplex node

The “Memory Allocated” section of the “Check Pipeline Execution Statistics” documentation page states, “Note that this number does not reflect the amount of memory that was freed and it is not the peak memory usage of the Snap. So, it is not necessarily a metric that can be used to estimate the required size of a Snaplex node.” I would like to determine the peak memory a Snap uses during execution, or at least the maximum it could use. Is this possible?
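
For now, the closest proxy I have is watching the Snaplex (JCC) JVM process itself at the node level. Below is a rough Python sketch using psutil that polls a process's resident set size and records the peak. To be clear about my assumptions: matching the process on “jcc” in its command line is just a guess about how the Snaplex process is named on my node, and this measures the whole node, not an individual Snap.

```python
import time
import psutil  # third-party: pip install psutil


def track_peak_rss(pid, duration_s=60, interval_s=1.0):
    """Poll a process's resident set size and return the peak observed (bytes).

    This watches the whole JVM process, so it reflects the node rather than a
    single Snap, and it can miss short spikes between polls.
    """
    proc = psutil.Process(pid)
    peak = 0
    deadline = time.time() + duration_s
    while time.time() < deadline:
        peak = max(peak, proc.memory_info().rss)
        time.sleep(interval_s)
    return peak


if __name__ == "__main__":
    # Find a Java process whose command line mentions "jcc" (an assumption
    # about how the Snaplex process appears on this node) and watch it.
    for p in psutil.process_iter(["pid", "name", "cmdline"]):
        cmdline = " ".join(p.info["cmdline"] or [])
        if "java" in (p.info["name"] or "") and "jcc" in cmdline.lower():
            print(p.info["pid"], track_peak_rss(p.info["pid"], duration_s=30))
            break
```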

Are the following assumptions valid?

Sort snap - The peak memory actually used can't be determined, but the maximum that could be used can be determined from the Snap's Maximum memory % setting.

In-memory Lookup snap - The maximum that could be used can be determined from the Maximum memory % setting, and the peak memory used would be the total memory allocated reported in the pipeline execution statistics.

Join snap (with unsorted input streams) - Would this behave similarly to the Sort snap? (There's a rough sketch of my reasoning just after this list.)

Aggregate snap - I'm not sure how to reason about this one at all.
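
To make my mental model concrete, here's a minimal Python sketch of why I expect Sort-like and unsorted-Join-like Snaps to have a peak memory footprint that grows with input volume, while purely streaming Snaps don't. This is not SnapLogic's implementation; the function names and the dictionary-based join are just my own illustration.

```python
def sort_like(docs, key):
    """Buffers every input document, so peak memory grows with input size."""
    buffered = list(docs)          # entire stream held in memory at once
    buffered.sort(key=key)
    yield from buffered


def unsorted_join_like(left, right, key):
    """With unsorted inputs, one side gets built into an in-memory table
    (here: the right side), so peak memory grows with that side's size."""
    table = {}
    for doc in right:              # whole right input buffered
        table.setdefault(doc[key], []).append(doc)
    for doc in left:               # left side streams through
        for match in table.get(doc[key], []):
            yield {**doc, **match}


def map_like(docs, fn):
    """A streaming Snap: roughly constant memory regardless of input size."""
    for doc in docs:
        yield fn(doc)
```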

Also, is there any way to estimate the required memory for a Snaplex node? Or any way to estimate how much of a workload a given amount of memory can support?
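
To make the second question concrete, here is the back-of-envelope arithmetic I'm currently using. Every number in it (average document size, in-memory overhead factor, heap fraction of node RAM) is my own assumption, so I'd appreciate corrections if this is the wrong way to think about it.

```python
def estimate_snap_memory_bytes(doc_count, avg_doc_bytes, overhead_factor=3.0):
    """Rough in-memory footprint of a Snap that buffers its input
    (e.g. Sort, Join with unsorted inputs, In-memory Lookup). The overhead
    factor for parsed-document/JVM object overhead is a guess."""
    return doc_count * avg_doc_bytes * overhead_factor


def fits_on_node(node_ram_gb, heap_fraction, max_memory_pct, snap_estimates):
    """Check whether the buffering Snaps of concurrently running pipelines
    fit under the node's effective ceiling. heap_fraction (share of RAM
    given to the JVM heap) and max_memory_pct are assumptions."""
    ceiling = node_ram_gb * 1024**3 * heap_fraction * (max_memory_pct / 100.0)
    return sum(snap_estimates) <= ceiling


# Example: two concurrent Sort snaps over 1M documents of ~2 KB each,
# on a 16 GB node with ~75% of RAM as heap and a Maximum memory % of 80.
sorts = [estimate_snap_memory_bytes(1_000_000, 2048) for _ in range(2)]
print(fits_on_node(16, 0.75, 80, sorts))
```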