02-08-2022 12:18 AM
Hi Team,
I have a requirement wherein I have to read files from SFTP and write them to Blob storage. This has to be done every 15 minutes. How can I ensure that I am not writing any duplicate files to the blob?
Regards
Sindhu
02-15-2022 06:56 AM
Yes, you can achieve it by manipulating a file in sldb, but it is easier if you have access to a database.
You can create a file in sldb and append an entry to it each time a file is processed.
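For illustration, a rough Python sketch of that idea, with local files standing in for the sldb tracking file and the SFTP folder (the paths and function names below are placeholders, not SnapLogic snaps):

```python
# Keep a simple "already processed" list in a tracking file (stand-in for sldb).
from pathlib import Path

TRACKING_FILE = Path("processed_files.txt")   # placeholder for the file kept in sldb
INCOMING_DIR = Path("incoming")               # placeholder for the SFTP folder

def load_processed() -> set:
    """Read the names of files that were already written to Blob storage."""
    if TRACKING_FILE.exists():
        return set(TRACKING_FILE.read_text().splitlines())
    return set()

def mark_processed(name: str) -> None:
    """Append a filename to the tracking file once it has been processed."""
    with TRACKING_FILE.open("a") as f:
        f.write(name + "\n")

def run_once() -> None:
    processed = load_processed()
    for path in INCOMING_DIR.iterdir():
        if path.name in processed:
            continue                          # duplicate, skip it
        # ... write the file to Blob storage here ...
        mark_processed(path.name)
```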
BR,
Marjan
02-08-2022 05:30 AM
Hi @sindhu ,
You can move the original file to another folder, for example “Archive”, after it has been written to Blob storage. Alternatively, you can rename the original file after it has been written to Blob storage.
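For example, something along these lines, sketched in Python with local folders standing in for the SFTP directories (the paths are placeholders only):

```python
# Move each file into an "Archive" subfolder once it is in Blob storage,
# so the next run only sees files that still need processing.
import shutil
from pathlib import Path

INCOMING_DIR = Path("incoming")            # placeholder for the SFTP folder
ARCHIVE_DIR = INCOMING_DIR / "Archive"     # processed files end up here

def archive(path: Path) -> None:
    """Move a file into the Archive folder after it has been uploaded."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    shutil.move(str(path), ARCHIVE_DIR / path.name)

for path in INCOMING_DIR.iterdir():
    if path.is_dir():
        continue                           # skip the Archive folder itself
    # ... write the file to Blob storage here ...
    archive(path)
```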
BR,
Marjan
02-15-2022 05:45 AM
Hi @Marjan,
Thanks for the response.
We should not rename the files or move them to a different folder, since the same SFTP path is being used by multiple applications.
Regards
Sindhu
02-15-2022 06:18 AM
Hi @sindhu ,
What about using a database, where you write a record (for example, the filename) for every file that has been processed successfully?
Then, before processing a new file from SFTP, you can read all the rows from that table and check whether the file has already been processed (i.e., whether it exists in the table). If not, proceed with it.
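Sketched in Python with sqlite3 just to show the idea (any database you have access to works the same way; the table and file names here are examples only):

```python
# Record every processed filename in a table and check it before processing.
import sqlite3

conn = sqlite3.connect("processed.db")
conn.execute("CREATE TABLE IF NOT EXISTS processed_files (filename TEXT PRIMARY KEY)")

def already_processed(name: str) -> bool:
    """Return True if a record for this filename already exists."""
    row = conn.execute(
        "SELECT 1 FROM processed_files WHERE filename = ?", (name,)
    ).fetchone()
    return row is not None

def record_processed(name: str) -> None:
    """Insert a record once the file has been written to Blob storage."""
    conn.execute("INSERT OR IGNORE INTO processed_files (filename) VALUES (?)", (name,))
    conn.commit()

for name in ["orders_001.csv", "orders_002.csv"]:   # placeholder for the SFTP listing
    if already_processed(name):
        continue
    # ... write the file to Blob storage here ...
    record_processed(name)
```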
BR,
Marjan
02-15-2022 06:53 AM
That would be a solution. We are currently not using a database in our account; we would have to get a new one created for this purpose.
Could this be a solution instead: writing the timestamp of the last processed file to a file in sldb, reading that timestamp before I start processing, and finally updating the file once processing is done? I am thinking of this but have not implemented it yet.
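Roughly what I have in mind, sketched in Python with local files standing in for the sldb file and the SFTP folder (all names are placeholders):

```python
# Keep only the newest processed modification time in a small state file,
# and on each run pick up the files that are newer than it.
from pathlib import Path

STATE_FILE = Path("last_processed.txt")   # placeholder for the file in sldb
INCOMING_DIR = Path("incoming")           # placeholder for the SFTP folder

def load_last_timestamp() -> float:
    return float(STATE_FILE.read_text()) if STATE_FILE.exists() else 0.0

def save_last_timestamp(ts: float) -> None:
    STATE_FILE.write_text(str(ts))

def run_once() -> None:
    last_ts = load_last_timestamp()
    newest = last_ts
    for path in sorted(INCOMING_DIR.iterdir(), key=lambda p: p.stat().st_mtime):
        mtime = path.stat().st_mtime
        if mtime <= last_ts:
            continue                       # already handled in a previous run
        # ... write the file to Blob storage here ...
        newest = max(newest, mtime)
    save_last_timestamp(newest)
```

One thing I am not sure about: a file landing on the SFTP with a modification time older than the stored timestamp would be skipped, so tracking filenames may be the safer option.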