How to audit payload in pipeline?

Hi all,

I’m new to the SnapLogic world, with past working experience in traditional ESB products like IBM Message Broker, webMethods ESB, TIBCO, etc. I wanted to understand whether SnapLogic provides any out-of-the-box feature to capture the input and output documents for auditing and logging purposes (in both synchronous and asynchronous scenarios). In a production environment, the captured input/output payloads come in handy for conducting RCA on a failed interface pipeline.

I’ve checked the public APIs shared by SnapLogic, but couldn’t find any way to capture the actual and transformed payloads for auditing at the entry and exit points.

https://doc.snaplogic.com/wiki/display/SD/Public+API+Documentation

Thanks in advance.

Best,
Yashu Vyas


Hi,

You’re right: the Public API only provides access to the log data captured in the Dashboard, which doesn’t include any document data from the pipelines.

One of our SnapLogic users solved this by creating a reusable ‘Logger’ pipeline, which is invoked via a Pipeline Execute snap at the relevant points within their pipelines. The Logger is very lightweight; it captures some basic document data and pushes it to our centralised logging, in our case Logstash.

However, we need to be careful about how much we use it, particularly in production, as it adds the overhead of an additional pipeline execution every time. At the very least, it should be designed so that you can enable ‘Reuse Executions’. We also don’t consider it good practice to capture significant amounts of document data in our logs because:

a) it will fill up log storage unnecessarily, since for most successful transactions we will never look at it
b) it presents a security/privacy risk if any sensitive business data is included (e.g. an employee’s date of birth), because logs are usually not as tightly secured

Generally we’ll just log the unique identifiers (e.g. employee ID) or, if you’re using one, the correlation ID.
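To make the idea concrete, here is a minimal sketch of what such a Logger pipeline might assemble and ship to Logstash: identifiers and a correlation ID only, never the full payload. The field names, the `employeeId` key, and the Logstash HTTP endpoint are all assumptions for illustration, not anything SnapLogic prescribes.

```python
import json
import urllib.request
from datetime import datetime, timezone

def build_audit_record(doc, pipeline_name, stage, correlation_id=None):
    """Build a minimal audit record: identifiers only, never the full payload.

    `doc` is the pipeline document; only the business key (here a
    hypothetical 'employeeId' field) is extracted."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "pipeline": pipeline_name,
        "stage": stage,                         # e.g. "entry" or "exit"
        "employee_id": doc.get("employeeId"),   # unique business identifier only
        "correlation_id": correlation_id,
    }

def ship_to_logstash(record, url="http://logstash.example.com:8080"):
    """POST the record to a Logstash HTTP input (hypothetical endpoint/URL)."""
    req = urllib.request.Request(
        url,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

The key design point is in `build_audit_record`: sensitive fields in the incoming document (such as a date of birth) are deliberately dropped rather than filtered out downstream.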

Cheers,
C.J.


Thanks a lot, CJ, for your quick response.

I understand: since SnapLogic is more of a data streaming platform, the underlying philosophy doesn’t support capturing document data for auditing purposes.

In our project we’re preparing for a large implementation with around 700+ integration scenarios, covering most of the business-critical processes of the enterprise.

Given our requirements, if we still have to capture the incoming/outgoing document data, could you please advise whether SnapLogic follows any specific internal message (document) format from which the document data can be accessed, or whether using a Copy snap is the only possible solution?

Regards,
Yashu Vyas

Hi Yashu,

Even on other platforms I have worked with, it is not recommended to log everything that comes your way; it is just a waste of space and resources. There are usually three levels of logging: info, warn, and error.

What I would recommend is to have a debug parameter; if its value equals Y, then do the logging at the important steps, as CJ mentioned.

Using this debug parameter ensures that you are not capturing these logs in production once you set it to N.
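The gating described above can be sketched as follows. The `debug_flag` argument stands in for a SnapLogic pipeline parameter passed into the child Logger pipeline; the names and Y/N convention here are illustrative, not a SnapLogic API.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline-audit")

def maybe_log(doc, step, debug_flag):
    """Log document data only when the debug parameter equals 'Y'.

    In production the parameter is set to 'N', so this becomes a no-op
    and the audit overhead disappears without changing the pipeline."""
    if debug_flag == "Y":
        log.info("step=%s doc=%s", step, doc)
        return True    # logged
    return False       # skipped
```

With this pattern, flipping a single parameter value at deploy time turns the payload logging on or off for the whole pipeline.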

Hope that makes sense.

Thanks, Naveen, for sharing your perspective. I completely agree it’s not advisable to log each and every pipeline step, as it would add overhead.

My query is more about the auditing part of it, where we want to capture the document payload at the entry and exit points. Is there an OOTB mechanism in SnapLogic to capture that, or will we have to leverage a Copy snap and submit the copies to an audit pipeline for subsequent processing?

Hope the requirement is clearer now.
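For clarity on the Copy-snap approach: the idea is to duplicate the document stream so one branch continues the main flow while the other feeds an audit pipeline. A minimal stand-in sketch (the function names and documents are hypothetical; SnapLogic’s Copy snap does this visually, not in code):

```python
from itertools import tee

def copy_stream(docs):
    """Stand-in for a Copy snap: duplicate a document stream into a
    main branch and an audit branch."""
    main_branch, audit_branch = tee(docs, 2)
    return main_branch, audit_branch

def audit(docs, sink):
    """Consume the audit branch, appending each payload to a sink
    (in SnapLogic terms, the audit pipeline's storage target)."""
    for doc in docs:
        sink.append(doc)

# Example: two documents fan out; both branches see every document.
main, audit_copy = copy_stream(iter([{"orderId": 1}, {"orderId": 2}]))
audit_log = []
audit(audit_copy, audit_log)
processed = [d["orderId"] for d in main]
```

The main flow is untouched; the audit branch receives a full copy of each document at the capture point.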

Regards,
Yashu Vyas

I have the same requirement. Did you find any solution?