Forum Discussion
Is that your assumption about this?
What happens is that the incoming documents are sent in round-robin fashion to this Snap even if you have the pool size enabled, so the same document will never be sent to two instances of execution.
Unless you have duplicates in your data, you will be safe with the pool and achieve true concurrency.
- RogerSramkoski, 3 years ago, Employee
Hi neelamanikandan, thank you for joining the SnapLogic Community within the last few weeks and for sharing how you split multiple array objects. I just want to double check, did you have a question about how you split the JSON arrays or were you just sharing a solution?
- neelamanikandan, 3 years ago, New Contributor II
Hi RogerSramkoski, it was a question. In the above example I had two fields (Servicecompany and Servicecompanydetail) whose values were arrays, so I used two JSON Splitters in the pipeline. But if my input JSON contains multiple array fields (e.g. 10), should I always use multiple JSON Splitters (10) in the pipeline? Is there an alternate Snap that can flatten all the array elements in one go, or is there an alternative approach with the JSON Splitter to flatten all arrays?
- ssapa, 3 years ago, Employee
neelamanikandan May I know what your expected output is?
- neelamanikandan, 3 years ago, New Contributor II
| Vehicle_Id | InceptionDt | _Id | ServiceCompany | ServiceCompanyAddressLine1 | ServiceCompanyPhone | ServiceCompanyPostalCode |
|---|---|---|---|---|---|---|
| _Id4 | 08/03/2023 | _Id5 | ServiceCompany45 | ServiceCompanyAddressLine150 | ServiceCompanyPhone26 | ServiceCompanyPostalCode2 |
| _Id4 | 08/03/2023 | _Id5 | ServiceCompany45 | ServiceCompanyAddressLine163 | ServiceCompanyPhone9 | ServiceCompanyPostalCode15 |
| _Id4 | 08/03/2023 | _Id6 | ServiceCompany47 | ServiceCompanyAddressLine150 | ServiceCompanyPhone26 | ServiceCompanyPostalCode2 |
| _Id4 | 08/03/2023 | _Id6 | ServiceCompany47 | ServiceCompanyAddressLine163 | ServiceCompanyPhone9 | ServiceCompanyPostalCode15 |
- alchemiz, 3 years ago, Contributor III
Hi neelamanikandan ,
Good day! See if this works: place this expression in a Mapper.

sl.ensureArray($ServiceCompanyArray).map(c => sl.ensureArray($ServiceCompanyDetailArray).map(d => {'Vehicle_Id': $Vehicle_Id, 'InceptionDt': $InceptionDt}.merge(c.merge(d)))).reduce((a,b) => a.concat(b), [])

Thanks,
EmEm
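For readers unfamiliar with the SnapLogic expression language, here is a rough Python sketch of what the expression above computes: a cross-join of the two array fields, with each pair merged into the parent's scalar fields. The field names and sample values are illustrative, based on the example input in this thread.

```python
def ensure_array(value):
    """Mimic sl.ensureArray: wrap a non-list value in a list."""
    return value if isinstance(value, list) else [value]

def flatten(doc):
    # Cross-join the two array fields; merge each (c, d) pair
    # on top of the parent's scalar fields.
    return [
        {"Vehicle_Id": doc["Vehicle_Id"], "InceptionDt": doc["InceptionDt"], **c, **d}
        for c in ensure_array(doc["ServiceCompanyArray"])
        for d in ensure_array(doc["ServiceCompanyDetailArray"])
    ]

doc = {
    "Vehicle_Id": "_Id4",
    "InceptionDt": "08/03/2023",
    "ServiceCompanyArray": [{"_Id": "_Id5"}, {"_Id": "_Id6"}],
    "ServiceCompanyDetailArray": [
        {"ServiceCompanyPhone": "ServiceCompanyPhone26"},
        {"ServiceCompanyPhone": "ServiceCompanyPhone9"},
    ],
}
print(len(flatten(doc)))  # 4 — a 2 x 2 cross-join
```

This matches the four-row output shown earlier in the thread: two companies times two detail entries.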
- neelamanikandan, 3 years ago, New Contributor II
This works, thank you!
But this will also be too much coding when we have multiple attributes and multiple arrays. Anyway, thanks for this solution; we can take it from here.
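To address the "many arrays" concern, one generic approach (a sketch, not a SnapLogic feature) is to cross-join every array-valued field in a single pass, so the logic does not grow with the number of arrays. The field names below are hypothetical examples:

```python
from itertools import product

def flatten_all(doc):
    """Cross-join all array-valued fields of a document in one pass."""
    scalars = {k: v for k, v in doc.items() if not isinstance(v, list)}
    arrays = {k: v for k, v in doc.items() if isinstance(v, list)}
    if not arrays:
        return [scalars]
    keys = list(arrays)
    rows = []
    # One output row per combination of elements, one from each array.
    for combo in product(*(arrays[k] for k in keys)):
        row = dict(scalars)
        for k, item in zip(keys, combo):
            if isinstance(item, dict):
                row.update(item)      # merge dict elements into the row
            else:
                row[k] = item         # keep scalar elements under their field name
        rows.append(row)
    return rows

doc = {
    "Vehicle_Id": "_Id4",
    "A": [{"a": 1}, {"a": 2}],
    "B": [{"b": 10}],
    "C": [{"c": 100}, {"c": 200}],
}
print(len(flatten_all(doc)))  # 2 * 1 * 2 = 4 combinations
```

The same idea could be ported to a single Mapper expression, since the number of splitters no longer depends on the number of array fields.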
- walkerline117, 8 years ago, Contributor
That's what I'm trying to say: we do have duplicates in my data.
In my case, it's firm employee data where the employee's location is one of the columns in the row (document).
e.g.
1. Employee1, New York
2. Employee2, New York
So the sub-pipeline first checks whether that employee's location is in the DB; if it is, it updates the location, and if it is not, it inserts the location into the DB.
In the above case, the location in the two documents is the same. Say the database does not have New York initially. With concurrent (or even round-robin) execution, if the first execution's insert of New York is delayed and does not complete before the second execution does its database check, the second execution would insert the same value into the DB.
Thanks
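One common way to sidestep this check-then-insert race (a sketch of the idea, not a SnapLogic recipe) is to deduplicate the location values upstream, before fanning documents out to parallel executions, so no two executions can ever race on the same key. The record layout mirrors the example rows above:

```python
# Records as (id, employee, location), as in the example above.
records = [
    (1, "Employee1", "New York"),
    (2, "Employee2", "New York"),
]

# Deduplicate locations while preserving first-seen order, so each
# distinct location is checked/inserted exactly once downstream.
seen = set()
unique_locations = []
for _id, employee, location in records:
    if location not in seen:
        seen.add(location)
        unique_locations.append(location)

print(unique_locations)  # ['New York'] — one insert/update per distinct location
```

With only distinct keys entering the parallel section, the duplicate-insert scenario described above cannot occur, regardless of execution timing.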
- dwhite, 8 years ago, Employee
If it's the same data and you're doing an update, it seems like it'd be a non-issue.
I’d like to ask why understand your use case more, You have a child pipeline running to do individual updates to a database?That’d eat a lot of resources spinning up and down those parallel executions, even if you reuse. From what you’ve described here it sounds like you’re injecting parallelism just for the sake of it.
It’d most likely be more efficient and performant to use a single db snap to do your updates and edit the batch settings on your db account.
Also you should not think of pipelines as threads, but a process that’s a collection of threads. In reality, each snap in a pipeline has its own thread. Those snaps combine to perform a semi-serial execution, and pipeline executions are generally not aware of each.
Best,
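The batching suggestion above can be sketched in a few lines: instead of one child-pipeline execution per document, group documents into fixed-size batches and issue one call per batch. The batch size of 200 here is an arbitrary illustration, not a SnapLogic default:

```python
def batches(docs, size=200):
    """Yield successive fixed-size batches from a list of documents."""
    for i in range(0, len(docs), size):
        yield docs[i:i + size]

docs = [{"id": n} for n in range(450)]
print([len(b) for b in batches(docs)])  # [200, 200, 50]
```

Three calls instead of 450 separate executions is the kind of saving the batch settings aim for.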
- walkerline117, 8 years ago, Contributor
Unfortunately, inserting the data into the DB is not done via SQL; we have to call a SOAP API to insert/update a single location into that system's DB.