Re: What is the purpose of the "Max in-flight" parameter in an ultra task?
"What's the downside to lowering this parameter? It will slow down the performance of the happy path since it disables prefetching requests from the feed-master." Can you elaborate on this statement? I did not follow it.

Re: Feedmaster queue pileup and the response time for ultra pipeline
Hi @KumarR, did you find any solution for this? What was the issue?

Re: Snaplex Node Freezing with High Memory
Hi @akidave, we have written our own JavaScript script in the pipeline flow and used Java collections in it. Can this lead to a memory leak? I believe the SnapLogic JVM performs GC itself, so do we have to take care of the variables we declare in our scripts? We are facing the same memory leak issue: even after the pipeline execution finishes, the memory is not freed. You said this is unusual, so does that mean our custom JavaScript code is leaking memory? And if that is the case, shouldn't the cleanup of those variables be performed by SnapLogic itself?
Regards, Aditya Kurhade.

Re: Changing snaplex of a task using api or any automated process
Thanks, it worked.

Re: Changing snaplex of a task using api or any automated process
Hi @tlui, thanks for your reply. While reading the task with the SnapLogic Read snap, we are not able to get a snaplex parameter in the metadata (snapshot attached); you can see there is no trace of the snaplex in the parameters object. I even tried adding a parameters.snaplex property explicitly, which leads to an error saying "additional parameters are not allowed in metadata while updating snap".

Changing snaplex of a task using api or any automated process
As we all know, when we migrate a project from one location to another, the snaplex of its tasks remains the same. Here is the problem: say we are migrating a project from one environment to another (e.g. Development to Testing) and we want to change the snaplex of all the tasks (right now we have a DEV snaplex and a QCVMWARE snaplex). Can anybody suggest a way to change the snaplex of the tasks through an automated procedure or REST API, so that we don't need to open each task and change the snaplex manually?

Re: Triggered task showing wrong error message!
There is no API hit limit for a Groundplex, right? How can we file a support case?

Re: Triggered task showing wrong error message!
Hi tstack, thanks for the reply. We are concurrently hitting the same Snaplex pipeline multiple times, so the memory allotted to the Snaplex is completely occupied by multiple instances of the same pipeline execution, and the subsequent REST calls are queued (as we can see from the Dashboard); that is why it is taking so much time. We are hitting the on-premise URL.
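For context, the queueing behaviour described in that reply can be reproduced with a small load script. The sketch below is a minimal illustration only, not SnapLogic code: the task URL and bearer token are placeholders, and the assumption is simply that many simultaneous POSTs to one triggered task exceed the node's available execution slots and start queueing, which shows up as long response times rather than outright errors.

```python
# Minimal sketch: fire N concurrent requests at a triggered-task URL to observe queueing.
# The URL and token below are placeholders, not real endpoints.
import json
import time
from concurrent.futures import ThreadPoolExecutor

import requests

TASK_URL = "https://example-groundplex.internal/api/1/rest/slsched/feed/ORG/project/my_task"  # placeholder
TOKEN = "REPLACE_WITH_BEARER_TOKEN"  # placeholder

def invoke(i):
    payload = {"id": i, "value": "test"}
    start = time.time()
    resp = requests.post(
        TASK_URL,
        headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"},
        data=json.dumps(payload),
        timeout=300,
    )
    # A long elapsed time with a 2xx status usually means the execution sat in the queue.
    return i, resp.status_code, round(time.time() - start, 1)

with ThreadPoolExecutor(max_workers=50) as pool:  # 50 simultaneous hits
    for result in pool.map(invoke, range(50)):
        print(result)
```

If the elapsed times climb while the status codes stay 2xx, the requests are being queued rather than rejected, which matches what the Dashboard shows.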
Triggered task showing wrong error message!
We have one pipeline which is used in a triggered task. The problem is that we hit this task concurrently with JSON data that is valid JSON; some requests are processed fine by the task and some of them fail with the following error. Here Mapper 1 is the first snap of the pipeline, and you can see the input document count is also zero. We have even created two nodes running the same Snaplex and implemented load balancing; the information for one of the running nodes follows. We just want to know why that error occurred, since it seems completely irrelevant. And if it is showing that error, does that mean our concurrent thread count is far above normal, so many executions are queued, and because of that the other executions fail without even executing a single snap in the pipeline?

Breaking execution based on a condition through child pipeline
I want to process multiple documents through one of my child pipelines. However, I want the processing of any further documents to stop if any one of them fails to pass a certain necessary condition. This is my JSON data, which contains Success and Failed values. While iterating over these documents, if the pipeline execution comes across a Failed value, I want the execution to halt, but the Pipeline Execute snap is giving the following output. The output should have contained only the first document, since the second document failed. Please find the pipelines attached: CHILD_PIPELINE.slp (13.8 KB), PARENT_PIPELINE.slp (5.2 KB). Basically, the requirement is: if I am processing an array of data through Pipeline Execute and I come across a document that fails, I want to set some flag in Pipeline Execute so that the subsequent documents see that flag and their execution is not started.
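Outside of SnapLogic, the "flag that short-circuits the remaining documents" requirement is just a stop-on-first-failure loop. The sketch below is a plain-Python illustration of that control flow under assumed names: `process_document` stands in for the child pipeline and the `status` field stands in for its output; it is not a Pipeline Execute configuration.

```python
# Plain-Python sketch of the stop-on-first-failure pattern described above.
# `process_document` and the `status` field are assumptions standing in for
# the child pipeline and its output document.

documents = [
    {"id": 1, "status": "Success"},
    {"id": 2, "status": "Failed"},
    {"id": 3, "status": "Success"},  # should never be processed
]

def process_document(doc):
    # Placeholder for whatever the child pipeline would do with one document.
    return {"id": doc["id"], "processed": True}

results = []
halted = False  # the "flag" that subsequent documents check

for doc in documents:
    if halted:
        break                  # skip everything after the first failure
    if doc["status"] == "Failed":
        halted = True          # set the flag; do not emit this document
        continue
    results.append(process_document(doc))

print(results)  # only the documents seen before the first failure, e.g. [{'id': 1, 'processed': True}]
```

Inside a pipeline the same effect requires the failure signal to reach the parent before the remaining documents are handed to Pipeline Execute (for example via an upstream filter or by aborting the parent on the error path); the right SnapLogic-side mechanism depends on the setup, so the sketch only shows the intended behaviour.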