01-25-2021 10:00 AM
Hi, here’s my problem:
I make a call to an API which gets 10 items of data. I then need to call a separate API (REST Post) 10 times, one for each unique item of data returned from the original call. These 10 calls would ideally be run in parallel from a performance perspective.
What’s the best way to achieve this? Because pipeline parameters aren’t passed down the line, the only way I can see to make this work is to put all 10 calls in their own pipeline, use a Pipeline Execute, and pass the response from the original call as a pipeline parameter that the 10 calls can then access. But that would still mean calling them in series. I’m pretty new to SnapLogic, and before I build this approach I wanted to check whether there are better approaches I’ve missed?
Thanks in advance.
01-25-2021 11:58 AM
Can you use a JSON Splitter after your initial call, resulting in 10 documents, which you then pass into a Pipeline Execute with a pool size of 10?
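Just to illustrate what the pool size buys you, in non-SnapLogic terms it’s the same idea as fanning the 10 documents out to a pool of 10 workers and collecting the responses. A rough plain-Python sketch (the endpoint and the post_item helper are made up for the example):

```python
from concurrent.futures import ThreadPoolExecutor
import requests

def post_item(item):
    # Hypothetical REST POST for one item; swap in your real endpoint and payload.
    resp = requests.post("https://example.com/api/items", json=item, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Stand-in for the 10 documents produced by the splitter.
items = [{"id": i} for i in range(10)]

# A pool of 10 workers means all 10 POSTs run at the same time,
# which is what the Pipeline Execute pool size of 10 gives you.
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(post_item, items))
```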
01-28-2021 11:27 AM
Thanks for that, your reply helped solve another problem I had. I’ll get back when I’ve confirmed it’s also a solution for this issue.
05-24-2022 04:35 PM
Hi,
I have a scenario where the API POST will only accept 100 records per minute.
My requirement is to post the records in batches of 100, with a wait of one minute between each POST.
05-25-2022 10:30 AM
Hi @karthik_dhina,
You can achieve this with a Script snap, inside which you can make the snap wait for some time.
Here is one of my posts where you can find the script.
The script is written in Python and simply calls the sleep function.
https://community.snaplogic.com/t/throttle-bot-or-other-loop-delay-mechanism/11423/2?u=viktor_n
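The idea looks roughly like this (a sketch from memory, not the exact script behind the link; it assumes the standard Python Script snap template and a 60-second sleep for your one-minute gap):

```python
from com.snaplogic.scripting.language import ScriptHook
import time

# Minimal Script snap hook: pass each incoming document through after a delay.
class DelayScript(ScriptHook):
    def __init__(self, input, output, error, log):
        self.input = input
        self.output = output
        self.error = error
        self.log = log

    def execute(self):
        while self.input.hasNext():
            doc = self.input.next()
            time.sleep(60)               # wait one minute before emitting the document
            self.output.write(doc, doc)  # pass the document along unchanged

# The Script snap looks for a ScriptHook instance in the "hook" variable.
hook = DelayScript(input, output, error, log)
```

If you place something like this between a Group By N snap (size 100) and your REST Post, each batch of 100 records will go out with a one-minute gap.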
Hope this will help.
Regards,
Viktor