
Need to chunk files and hit the API for each record

reachsenthil
New Contributor

I need to chunk files and hit the API for each record. The API is restricted to 1,000 records per request, but the input has millions of records. How can I handle this in the same pipeline?

3 REPLIES

Abhishek_Soni37
Contributor

Hi @reachsenthil

Assuming you are using the REST Post snap to send data to the API, you can define the batch size there.
Have a look at REST Post.

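For illustration only, here is a minimal Python sketch of what that batch setting amounts to, assuming a hypothetical JSON endpoint (the URL and payload shape below are made up): the record stream is cut into groups of at most 1,000 and one POST is sent per group.

```python
import json
from itertools import islice
from urllib.request import Request, urlopen

BATCH_SIZE = 1000  # API limit: at most 1,000 records per request
API_URL = "https://api.example.com/records"  # hypothetical endpoint

def batches(records, size):
    """Yield successive lists of at most `size` records."""
    it = iter(records)
    while chunk := list(islice(it, size)):
        yield chunk

def post_all(records):
    """Send the whole input as a series of batched POST requests."""
    for chunk in batches(records, BATCH_SIZE):
        req = Request(
            API_URL,
            data=json.dumps(chunk).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urlopen(req) as resp:  # one request per batch of 1,000 records
            resp.read()
```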

Hi @Soni37, I am using REST Get to hit the URL with parameters, get the response, and load it into a table, so the restriction is that I can only hit the API with 1,000 records per request.

You can use the Group By N snap to create batches of requests before feeding them to the REST Get snap.
Test_Expression_2023_06_22.slp (3.8 KB)

The response will look like this (screenshot attached); in my case, I created a group of 8 records.
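Since the question is about REST Get, here is a hedged Python sketch of the same Group By N idea, assuming the grouped record ids can be passed as one comma-separated query parameter (the endpoint, parameter name, and JSON response format are assumptions, not the actual API):

```python
import json
from itertools import islice
from urllib.parse import urlencode
from urllib.request import urlopen

GROUP_SIZE = 1000  # mirrors the "Group size" setting of Group By N
BASE_URL = "https://api.example.com/lookup"  # hypothetical GET endpoint

def group_by_n(records, n):
    """Rough equivalent of the Group By N snap: emit lists of up to n records."""
    it = iter(records)
    while group := list(islice(it, n)):
        yield group

def fetch_all(record_ids):
    """Issue one GET per group of ids and collect the responses."""
    results = []
    for group in group_by_n(record_ids, GROUP_SIZE):
        query = urlencode({"ids": ",".join(map(str, group))})
        with urlopen(f"{BASE_URL}?{query}") as resp:  # assumes a JSON array response
            results.extend(json.loads(resp.read()))
    return results
```

Millions of input records then become a few thousand requests, which is the same effect Group By N has inside the pipeline.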

Let me know if it works. 🙂