Is there a limit on the size of a JSON document that SnapLogic supports?
Are you asking about the document in memory or when it's read from or written to disk?
In memory, the document is made up of Java objects, and I don't think we add any limits beyond what native Java and the node's available memory impose. One limit that comes to mind: the size of a byte array or string must be less than 2 GB.
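For reference, that ~2 GB figure comes from the JVM itself rather than anything SnapLogic-specific: Java arrays are indexed with `int`, so a `byte[]` can hold at most `Integer.MAX_VALUE` elements. A minimal sketch (not SnapLogic code, just the JVM-level cap):

```java
// Illustrates the JVM array-length cap behind the "< 2 GB" limit.
// Java arrays are indexed by int, so a byte[] tops out at
// Integer.MAX_VALUE (2^31 - 1) elements -- just under 2 GB.
public class ArrayLimit {
    public static void main(String[] args) {
        long maxArrayBytes = Integer.MAX_VALUE; // 2147483647 bytes
        System.out.println("Max byte[] length: " + maxArrayBytes);
        // Note: many JVMs refuse allocations a few elements below this
        // (throwing OutOfMemoryError), so the practical ceiling is
        // slightly lower than Integer.MAX_VALUE.
    }
}
```

So a single string or byte array backing a JSON value cannot exceed that length, regardless of how much heap the node has.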
I don’t think there are any limits when serializing/deserializing.
Why are you asking?
One thing that I COULD add is that there IS a limit somewhere in preview mode. It could be on the read, or internal, but it is there, at least in the Snaplexes I have used on the cloud. I guess they figure there is no reason to have a test larger than that, so you have to keep your test data sets relatively small.

If you actually run the pipeline as a background task, or scheduled, there is no such small limit. I have run what I would consider large data sets, with some large JSON records. So if you are asking because it isn't running in preview mode, there is your answer.

BTW, the problem I described can also produce WEIRD errors that may send you chasing a red herring. In such a case, try running the pipeline outside of preview mode. If it runs, the data set was probably too big for preview mode.