How to copy logs from /opt/snaplogic/run/log through a pipeline without data loss or duplication?

smit66
New Contributor II

Hello,

I want to move a Snaplex node's logs from /opt/snaplogic/run/log to a target application, and for that I want to use a SnapLogic (SL) pipeline.
The logs rotate based on file size.

For example, when the jcc.json file reaches 100 MB, it is moved to a backup folder. The backup folder holds up to 10 jcc.json files; once it hits the 10-file mark, the oldest file in the backup folder is deleted.

Question: How should I approach copying logs from the backup folder with an SL pipeline so that we neither lose logs nor end up with duplicate logs?

2 REPLIES

alchemiz
Contributor III

Hi Smitkumar,

Good day. Based on the scenario you posted, it looks like you need a file watcher. There is no such thing in the pipelines view, unless you schedule the pipeline to run every 15 seconds or so.

A quick set of snaps that comes to mind: a Directory Browser snap, a Copy snap, a Sort snap, a File Operation snap, and a Delete snap for the 10-file mark. Depending on what you are comfortable with, you can also add a Sequence snap.

The first line of the pipeline covers scenario (1):

Use a Directory Browser to browse the log source path, add a Filter to keep only the files with size >= 100 MB, then feed the path to a File Operation snap with the move action so the file is also deleted from the source directory.
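
Outside of SnapLogic, the logic of scenario (1) looks roughly like the Python sketch below. The folder paths, the *.json filename pattern, and the timestamp rename are assumptions for illustration only; in the actual pipeline the Directory Browser, Filter, and File Operation (move) snaps play these roles.

```python
import shutil
from pathlib import Path

SOURCE_DIR = Path("/opt/snaplogic/run/log")          # node log path from the question
TARGET_DIR = Path("/opt/snaplogic/run/log/backup")   # assumption: backup/target folder
SIZE_LIMIT = 100 * 1024 * 1024                       # the 100 MB rotation threshold

def move_large_logs():
    """Move any log file that has reached the size limit into the target folder."""
    TARGET_DIR.mkdir(parents=True, exist_ok=True)
    for log_file in SOURCE_DIR.glob("*.json"):
        # Filter step: keep only files with size >= 100 MB
        if log_file.is_file() and log_file.stat().st_size >= SIZE_LIMIT:
            # Assumption: add a timestamp so successive rotations do not overwrite each other
            new_name = f"{log_file.stem}.{int(log_file.stat().st_mtime)}{log_file.suffix}"
            # File Operation (move): lands in the target and is removed from the source dir
            shutil.move(str(log_file), str(TARGET_DIR / new_name))

if __name__ == "__main__":
    move_large_logs()
```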

The second line of the pipeline covers scenario (2):
Use a Directory Browser to browse the target path, add a Sort snap to sort by update date, then a Sequence snap to label the files, then a Filter snap to keep the files labelled > 10, and finally a Delete snap.
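
Scenario (2), pruning the target folder back to the 10-file mark, can be sketched the same way; again the folder path and the *.json pattern are assumptions, and the sort/enumerate/filter/delete steps stand in for the Sort, Sequence, Filter, and Delete snaps.

```python
from pathlib import Path

TARGET_DIR = Path("/opt/snaplogic/run/log/backup")   # assumption: same target folder as above
KEEP_COUNT = 10                                       # the 10-file mark from the question

def prune_old_backups():
    """Keep the 10 newest files in the target folder and delete the rest."""
    # Sort step: newest first by update (modification) date
    backups = sorted(
        (p for p in TARGET_DIR.glob("*.json") if p.is_file()),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    # Sequence + Filter steps: files labelled 11, 12, ... are past the retention limit
    for label, old_file in enumerate(backups, start=1):
        if label > KEEP_COUNT:
            old_file.unlink()  # Delete step

if __name__ == "__main__":
    prune_old_backups()
```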

🙂 hope this helps

smit66
New Contributor II

Hi Mike,

Thanks for your suggestions; they definitely helped.
Smit