
Parser issue with size

ryagnik
New Contributor

I am trying to parse a 600 MB file with the JSON Parser. The File Reader reads the file, but the JSON Parser fails with the following error:

Failure: Cannot parse JSON data, Reason: Exception while reading document from the stream, SLDB does not support data larger than 100 MB, Resolution: Please check for correct input JSON data.

What is the best practice for parsing large JSON files? What is the role of SLDB here, and what is being stored in it? Even the architecture documentation makes no mention of SLDB.

Can someone clarify the details here?
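For context on the general question of best practice (independent of SnapLogic's SLDB limit), the usual approach to files too large to load at once is to parse them incrementally rather than reading the whole document into memory. A minimal sketch, assuming the data is in JSON Lines format (one JSON object per line) and using only the Python standard library:

```python
import json
import io

def iter_json_records(stream):
    """Yield one record at a time from a JSON Lines stream,
    so the whole file never has to fit in memory."""
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# An in-memory stream stands in for a large file on disk.
sample = io.StringIO('{"id": 1}\n{"id": 2}\n')
ids = [rec["id"] for rec in iter_json_records(sample)]
print(ids)  # [1, 2]
```

If the input is a single large JSON document rather than JSON Lines, a streaming parser (for example a third-party library such as ijson) would be needed instead, since `json.loads` requires the full document.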

8 REPLIES

Please note that this only happens in preview mode; I was able to execute the pipeline with the large file.

Please note that with large files I can't validate the pipeline, but I can execute it.

So I think it is a defect that we cannot validate the pipeline with a large file.

aleung
Contributor III

Ah, I see. In that case you can make the File name in File Reader a variable that reads from a pipeline parameter. Hard-code a smaller file there as the default, then set your normal large file in the Task settings.

ryagnik
New Contributor

Thanks. I would still consider this a defect on SnapLogic's end. Sometimes it is desirable to unit test with large files, and in that case validation does not allow us to proceed.