"JSON Splitter" snap truncates data!

Abhijit
New Contributor

Is there a limit on the data size the "JSON Splitter" snap can handle? I feed it a 36 MB JSON input and get less than 1 MB of output. It looks like the "JSON Splitter" simply truncates the data! Ideally it should not, but that is what is happening. Has anyone seen this odd behavior? Unfortunately I have to split the JSON to process each sub-element. With smaller source files it works fine.
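For context, here is roughly what a JSON splitter does, sketched in Python. This is a hypothetical stand-in, not SnapLogic's implementation; the function name, the dotted-path convention, and the sample payload are all made up for illustration:

```python
import json

def split_json(doc, path):
    """Yield one document per element of the list found at `path`
    (a dotted key path such as "response.countries")."""
    node = doc
    for key in path.split("."):
        node = node[key]
    for element in node:
        yield element

# Toy payload standing in for a large API response.
payload = {"response": {"countries": [{"name": "France"}, {"name": "Japan"}]}}
docs = list(split_json(payload, "response.countries"))

# Every element should survive the split, regardless of input size.
assert len(docs) == len(payload["response"]["countries"])
```

The point of the sketch: splitting is a streaming, one-in-many-out operation, so there is no inherent reason for it to drop data based on input size.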

4 REPLIES

shodge
Former Employee

Could you share the pipeline, or some screenshots of it? It is possible for the JSON Splitter to produce less output than input (see the first example in the docs: http://doc.snaplogic.com/com-snaplogic-snaps-transform-jsonsplitter_1), but I'd like to investigate further.

Thanks,

Shayne

sarathmattam
New Contributor
At the outset, apologies if it is against the rules of the community to start a discussion on an old thread.

I am facing a situation similar to the one mentioned in the post. I have a JSON file with weather details for close to 100 countries, but when I use a JSON Splitter snap, the output is reduced to only 50 documents. Ideally, I should get close to 100 documents (equal to the number of countries in my input).

Attaching a snapshot of my pipeline for your reference:

[screenshot]

This is the output of the RestGet snap, which has close to 100 entries:

[screenshot]

Once the data crosses the JSON Splitter snap, the number of documents is reduced to 50:

[screenshot]

I tried various options. Is there any limitation at the Snap level?

Thanks,
Sarath Kallayil

sarathmattam
New Contributor

Guys … I figured this out. I was seeing this because I was viewing the data during pipeline validation. In that case, the number of records shown is limited to 50. However, when I created a task and executed the pipeline, I got the expected results.
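For anyone else landing here: validation shows only a preview of the document stream, while execution emits everything. Conceptually it's the difference between taking the first 50 items of a stream and consuming the whole stream; no data is lost. A toy Python illustration (the generator and the count of 100 are made up to mirror this thread, not anything SnapLogic-specific):

```python
from itertools import islice

def full_stream():
    # Stand-in for the ~100 documents a full pipeline execution would emit.
    for i in range(100):
        yield {"country_id": i}

preview = list(islice(full_stream(), 50))   # what validation shows
executed = list(full_stream())              # what a task execution emits

assert len(preview) == 50
assert len(executed) == 100
```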

scotth
New Contributor II

Thanks for this post, I got quite concerned when I only saw 50 records returned on 'validation'! As mentioned, the secret is to Execute the pipeline (a scheduled task is optional).