10-02-2017 02:04 PM
I am trying to parse a 600 MB file using the JSON Parser. The File Reader reads the file, but the JSON Parser runs into the following error:
Failure: Cannot parse JSON data, Reason: Exception while reading document from the stream, SLDB does not support data larger than 100 MB, Resolution: Please check for correct input JSON data.
What is the best practice for parsing a large JSON file? What is the role of SLDB here, and what is being stored in it? Even in the architecture document there is no mention of SLDB.
Can someone clarify the details here?
10-03-2017 07:55 AM
Please note that this only happens in preview mode; I was able to execute the pipeline with the large file.
10-03-2017 08:27 AM
To be clear: with large files I can’t validate the pipeline, but I can execute it.
So I think it is a defect that we cannot validate a pipeline with a large file.
10-03-2017 10:29 AM
Ah, I see. In that case you can make the File name in the File Reader an expression that reads from a pipeline parameter. Hard-code a smaller sample file as the parameter’s default (so validation/preview uses it), then set your normal large file in the Task settings, as sketched below.
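A minimal sketch of that setup, assuming a parameter named `filename` (the parameter name and file paths are placeholders; adjust them to your project):

```
Pipeline Properties -> Parameters:
    filename = small_sample.json         # small file, picked up when you validate/preview

File Reader -> File (expression enabled, "=" toggled on):
    _filename                            # resolves to the pipeline parameter above

Triggered/Scheduled Task -> Parameters:
    filename = large_input.json          # large file, used only at execution time
```

This way validation stays under the 100 MB preview limit, while scheduled or triggered executions still process the full-size file.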
10-04-2017 08:06 AM
Thanks. I would still consider this a defect on the SnapLogic end; sometimes it is desirable to unit test with large files, and in that case validation does not allow us to proceed.