Forum Discussion
May I suggest, then, that if you have an issue with writing documents to the local filesystem or SLDB, you write to a temporary file whose name is not indicative of its contents and build the ability to clean that temp file up afterwards into your pipeline. That way you can use the REST Post Snap in multipart mode without leaving artifacts on your filesystem, whatever that may be.
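Roughly, that staging pattern looks like this (a minimal Python sketch, not anything SnapLogic-specific; the payload, prefix, and comments are placeholders for your own details):

```python
import os
import tempfile

def stage_payload(data: bytes) -> str:
    """Write the payload to a temp file whose name says nothing about its contents."""
    fd, path = tempfile.mkstemp(prefix="stage_", suffix=".tmp")
    with os.fdopen(fd, "wb") as handle:
        handle.write(data)
    return path

# Stage, post from the file, and always clean up so no artifact is left behind.
path = stage_payload(b"...document bytes...")
try:
    pass  # the multipart POST from `path` goes here
finally:
    os.remove(path)
```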
When you use Postman or cURL (or anything, really) to make a multipart POST, you stream from a file local to the client making the connection. That is really the only sustainable way to multipart-post files of nontrivial size. The Snap is no different: it needs to stream from a file, and its notion of "local" is SLDB or the node's filesystem. Because the Snap only takes one set of credentials (which are for the POST target), it's not possible for it to reach out anywhere else unless the URI you point it at has no authentication in place.
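Outside of SnapLogic, that pattern looks roughly like the sketch below, using Python's requests together with requests-toolbelt so the file is actually streamed rather than buffered in memory (the URL, field name, file path, and credentials are placeholders):

```python
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

# Build a streaming multipart/form-data body from a file local to the client.
# The encoder reads from the open file handle as the request is sent, so a
# nontrivially sized file is never loaded into memory all at once.
with open("/tmp/stage_abc123.tmp", "rb") as fh:
    body = MultipartEncoder(fields={
        "file": ("report.csv", fh, "application/octet-stream"),
    })
    response = requests.post(
        "https://target.example.com/upload",   # POST target (placeholder)
        data=body,
        headers={"Content-Type": body.content_type},
        auth=("api_user", "api_password"),     # credentials for the POST target only
        timeout=120,
    )
response.raise_for_status()
```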
The reason a binary REST Post Snap is not really feasible is that you would lose all of the other data in the document. When you have binary output in SnapLogic, you only have that binary stream and the headers describing it. You'd lose any other data fields you'd want in that document, and they couldn't be used to configure the Post Snap, meaning all of your setup would have to be static (which is not great) or parameterized via pipeline parameters (which is also not very flexible).
If you'd like to send me an email at dwhite@snaplogic.com, I can share a script I've written that does multipart posting: it downloads to the local filesystem, uploads as multipart (both the file and any form data you want), and cleans up afterwards. The caveat, again, is that the callout for the download must be unauthenticated, but maybe you could co-opt it for your own designs.
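In the meantime, here is a rough outline of the shape of that workflow, not the script itself: download without auth to a temp file, post it as multipart along with any form data, then clean up (all URLs, field names, paths, and credentials below are placeholders):

```python
import os
import tempfile

import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

def download_to_temp(url: str) -> str:
    """Download the source file (this callout must need no auth) to a temp file."""
    fd, path = tempfile.mkstemp(prefix="dl_", suffix=".tmp")
    with requests.get(url, stream=True, timeout=120) as resp:
        resp.raise_for_status()
        with os.fdopen(fd, "wb") as out:
            for chunk in resp.iter_content(chunk_size=64 * 1024):
                out.write(chunk)
    return path

def upload_multipart(path: str, url: str, form_data: dict, auth) -> requests.Response:
    """Stream the temp file plus any form fields as multipart/form-data."""
    with open(path, "rb") as fh:
        fields = dict(form_data)
        fields["file"] = (os.path.basename(path), fh, "application/octet-stream")
        body = MultipartEncoder(fields=fields)
        resp = requests.post(url, data=body,
                             headers={"Content-Type": body.content_type},
                             auth=auth, timeout=300)
    resp.raise_for_status()
    return resp

source = "https://files.example.com/export.csv"   # must be reachable without auth
target = "https://target.example.com/upload"      # POST target
path = download_to_temp(source)
try:
    upload_multipart(path, target, {"description": "nightly export"},
                     auth=("api_user", "api_password"))
finally:
    os.remove(path)                               # cleanup: leave no artifact behind
```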
@dwhite@snaplogic.com Can you please share the script / pipeline for this?