SFTP to blob storage

sindhu
New Contributor II

Hi Team,

I have a requirement wherein I have to read files from SFTP and write them to Blob storage. This has to be done every 15 minutes. How can I ensure I am not writing any duplicate files to the blob?

Regards
Sindhu

1 ACCEPTED SOLUTION

Yes, you can achieve it by manipulating a file in sldb, but it is easier if you have access to a database.

You can create a file in sldb and append a record to it (for example, the filename) each time a file is processed.

BR,
Marjan
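For anyone landing on this thread later, here is a minimal Python sketch of that idea, not a production pipeline. It stands in for the SnapLogic pipeline: a local text file (processed_files.txt) plays the role of the ledger file kept in sldb, and the SFTP host, credentials, container name, and paths are placeholders you would replace with your own.

```python
import paramiko
from azure.storage.blob import BlobServiceClient

LEDGER_PATH = "processed_files.txt"                    # stand-in for the ledger file in sldb
SFTP_HOST, SFTP_DIR = "sftp.example.com", "/incoming"  # placeholder connection details
BLOB_CONN_STR, CONTAINER = "<connection-string>", "landing"

def load_ledger():
    """Return the set of filenames already copied to Blob storage."""
    try:
        with open(LEDGER_PATH) as f:
            return {line.strip() for line in f if line.strip()}
    except FileNotFoundError:
        return set()

def run_once():
    processed = load_ledger()
    container = BlobServiceClient.from_connection_string(BLOB_CONN_STR) \
        .get_container_client(CONTAINER)

    transport = paramiko.Transport((SFTP_HOST, 22))
    transport.connect(username="user", password="***")
    sftp = paramiko.SFTPClient.from_transport(transport)

    for name in sftp.listdir(SFTP_DIR):
        if name in processed:
            continue                                   # already copied on an earlier run
        with sftp.open(f"{SFTP_DIR}/{name}", "rb") as remote_file:
            container.upload_blob(name=name, data=remote_file, overwrite=False)
        # Append to the ledger only after the upload succeeded.
        with open(LEDGER_PATH, "a") as f:
            f.write(name + "\n")

    sftp.close()
    transport.close()

if __name__ == "__main__":
    run_once()   # schedule this every 15 minutes (cron, pipeline scheduler, etc.)
```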


6 REPLIES

marjan_karafilo
Contributor

Hi @sindhu ,

You can move the original file to another folder, for example "Archive", after it has been written to Blob storage. Or, you can rename the original file after it has been written to Blob storage.

BR,
Marjan
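If moving the file were an option, it would be a single rename call after the upload succeeds. A rough sketch with paramiko, assuming the "Archive" folder from the example above and placeholder host, credentials, and paths:

```python
import paramiko

# Placeholder connection details, for illustration only.
transport = paramiko.Transport(("sftp.example.com", 22))
transport.connect(username="user", password="***")
sftp = paramiko.SFTPClient.from_transport(transport)

filename = "report.csv"
# ... upload /incoming/report.csv to Blob storage here ...

# Move the file into the Archive folder so the next run no longer sees it.
sftp.rename(f"/incoming/{filename}", f"/incoming/Archive/{filename}")

sftp.close()
transport.close()
```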

Hi @Marjan,

Thanks for the response.

We should not rename the files or move them to a different folder, since the same SFTP path is used by multiple applications.

Regards
Sindhu

Hi @sindhu ,

What about using a database, where you write a record (for example, the filename) for every file that has been processed successfully?
Then, before processing a new file from SFTP, you can read the rows from that table and check whether the file has already been processed (i.e., whether it exists in the table). If not, proceed with it.

BR,
Marjan
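A sketch of that check, using SQLite purely as a stand-in for whatever database the account ends up with; the table and column names are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect("processed.db")   # stand-in for the account database
conn.execute("CREATE TABLE IF NOT EXISTS processed_files (filename TEXT PRIMARY KEY)")

def already_processed(filename: str) -> bool:
    row = conn.execute(
        "SELECT 1 FROM processed_files WHERE filename = ?", (filename,)
    ).fetchone()
    return row is not None

def mark_processed(filename: str) -> None:
    # Called only after the file has been written to Blob storage successfully.
    conn.execute("INSERT INTO processed_files (filename) VALUES (?)", (filename,))
    conn.commit()

for name in ["a.csv", "b.csv"]:          # would come from the SFTP listing
    if already_processed(name):
        continue
    # ... copy the file to Blob storage here ...
    mark_processed(name)
```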

That would be a solution. We are currently not using a database in our account; we would have to get a new one created for this purpose.

Another idea: write the timestamp of the last processed file to a file in sldb, read that timestamp before processing starts, and update the file once processing is done. Could this be a solution? I am thinking of this but have not implemented it yet.
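For reference, a rough Python sketch of that read/compare/update cycle, with the caveat that a single timestamp assumes no new file ever arrives with a modification time at or before the stored value. A local file stands in for the sldb timestamp file, and the SFTP host, credentials, and paths are placeholders:

```python
import paramiko

STATE_PATH = "last_processed.txt"          # stand-in for the timestamp file in sldb

def read_last_timestamp() -> float:
    try:
        with open(STATE_PATH) as f:
            return float(f.read().strip())
    except FileNotFoundError:
        return 0.0                         # first run: process everything

def write_last_timestamp(ts: float) -> None:
    with open(STATE_PATH, "w") as f:
        f.write(str(ts))

transport = paramiko.Transport(("sftp.example.com", 22))   # placeholder host
transport.connect(username="user", password="***")
sftp = paramiko.SFTPClient.from_transport(transport)

last_ts = read_last_timestamp()
newest = last_ts
for entry in sftp.listdir_attr("/incoming"):               # exposes st_mtime per file
    if entry.st_mtime <= last_ts:
        continue                           # already handled in a previous run
    # ... copy entry.filename to Blob storage here ...
    newest = max(newest, entry.st_mtime)

write_last_timestamp(newest)               # update only after the batch succeeds

sftp.close()
transport.close()
```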