"JSON Splitter" snap truncate data!

Is there any limit on the data size that the "JSON Splitter" snap can handle? I get a 36 MB JSON as input, but the output is less than 1 MB. It looks like the JSON Splitter is simply truncating the data, which ideally should not happen, yet it is! Has anyone else seen this odd behavior? Unfortunately, I have to split the JSON to process each sub-element. For smaller source files it works fine, though.

Could you share the pipeline or some screenshots of it? It is possible for the JSON Splitter to produce fewer output documents than input documents - see the first example in the docs, http://doc.snaplogic.com/com-snaplogic-snaps-transform-jsonsplitter_1 - but I’d like to investigate further.
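
To illustrate why a split can legitimately emit fewer documents than you expect, here is a minimal Python sketch (not SnapLogic code) of what a path-based split does conceptually; the `split_at_path` function and the sample document are made up for illustration:

```python
import json

def split_at_path(document, path):
    """Sketch of a JSON-Splitter-style split: walk down `path`
    (a list of keys) and emit one output document per array element."""
    target = document
    for key in path:
        target = target[key]
    # Each element of the matched array becomes its own output document,
    # so the output count depends on the path, not the input file size.
    return list(target)

# Hypothetical input: the matched array has only 2 elements, so only
# 2 documents come out even if the rest of the input is very large.
doc = {"response": {"entries": [{"id": 1}, {"id": 2}], "meta": {"huge": "..."}}}
for out_doc in split_at_path(doc, ["response", "entries"]):
    print(json.dumps(out_doc))
```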

Thanks,

Shayne

At the outset, apologies if starting a discussion on an old thread is not allowed per the community rules.

I am facing a situation similar to the one mentioned in this post. I have a JSON file with weather details for close to 100 countries. But when I use a JSON Splitter snap, the output is reduced to only 50 documents. Ideally, I should get close to 100 documents (matching the number of countries in my input).

Attaching a snapshot of my pipeline for your reference

This is the output of the RestGet snap, with close to 100 entries.

Once it crosses the JSON Splitter snap, the number of documents is reduced to 50.

I tried various options. Is there any limitation at the Snap level?

Thanks,
Sarath Kallayil

Guys … I figured this out. I was seeing this because I was viewing the data during pipeline validation. In that case, the number of records shown in the preview is limited to 50. However, when I created a task and executed the pipeline, I got the expected results.
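
For anyone else who hits this, here is a small conceptual sketch (plain Python, not SnapLogic internals) of the difference I observed between the validation preview and a real execution; the `run_pipeline` function and the `PREVIEW_LIMIT` constant are hypothetical and only mirror the behavior described above:

```python
# Conceptual sketch: the validation preview only shows the first N
# documents, while an executed task processes everything.
PREVIEW_LIMIT = 50  # the preview document count I was seeing

def run_pipeline(documents, validate_only=False):
    if validate_only:
        # Preview: downstream snaps only display the first 50 documents.
        documents = documents[:PREVIEW_LIMIT]
    return [doc for doc in documents]  # stand-in for the real processing

all_docs = [{"country": i} for i in range(100)]
print(len(run_pipeline(all_docs, validate_only=True)))   # 50  -> what validation showed
print(len(run_pipeline(all_docs, validate_only=False)))  # 100 -> what the executed task produced
```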