Script Snap Limit of 50 Documents?

I’m working on transforming some data that’s coming in from a REST GET Snap. The format of the data that is being returned is quite disjointed. Essentially, the headers come in one of the JSON elements, and the data comes in another set. I have to marry the two together in SnapLogic.

I have successfully written a script that loops through each of the “rows”, and pairs the data with the appropriate header. I have accomplished this in two different ways.

  1. I have created a script that outputs just one document, with an array of the responses.
  2. I have also created another version of the script that outputs each row as its own document.
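For context, the pairing logic behind approach #2 can be sketched in plain Python. The field names `headers` and `rows` here are hypothetical placeholders for whatever the actual REST response uses, and this is just the transformation logic, not the Script Snap's input/output API:

```python
# Sketch of pairing each row with its headers, as described above.
# The response shape (a "headers" list plus a "rows" list of lists)
# is an assumption; adjust the key names to match the real payload.

def pair_rows(payload):
    """Yield one dict per row, keyed by the corresponding header names."""
    headers = payload["headers"]
    for row in payload["rows"]:
        # zip() pairs each header with the value at the same position
        yield dict(zip(headers, row))

# Example of a disjointed response where headers and data arrive separately
response = {
    "headers": ["id", "name", "amount"],
    "rows": [
        [1, "alpha", 10.5],
        [2, "beta", 20.0],
    ],
}

# In approach #2, each yielded dict would be written out as its own
# document; in approach #1, the whole list would go out as one document.
documents = list(pair_rows(response))
```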

#2 seems to be the best approach here, as downstream snaps can handle the data without any problems. It looks great. But I’m running into a limitation that doesn’t make sense: I can only output 50 documents. If I attempt anything more, the Snap throws an error. I just don’t understand this limitation. What is the correct approach for working on larger datasets if the Script Snap limits me so heavily?

#1 works well too, except that the Script Snap outputs the rows as an Object, and I can’t use the JSON Splitter Snap to split up the object (which is really just an Array).

So I’m kinda stuck here. I’m sure I could devise some tricky workaround to split up the rows, and essentially execute the same Script Snap multiple times, but that seems kludgy at best.

What’s the best practice here, and why am I so limited to outputting only 50 documents?

Thanks in advance.

When you Validate a pipeline instead of Executing it, the number of documents in the preview output of each snap is limited to the Preview Document Count in the User Settings, which defaults to 50. You can increase it up to 2000.


@ptaylor Thank you so much! This was exactly what I needed, and it solved my problem perfectly. In the past, I was aware that the preview was essentially truncating the number of results I could see, but I didn’t realize it would impact me in this way.

Thanks again!

Glad to help!