I have a pipeline that writes a file to an FTP server. The pipeline consists of a File Reader -> Binary Router -> File Writer, and the writer is configured with an FTP account.
A separate parent pipeline reads all the files from a particular location, processes them, and calls the above pipeline to write each file to the FTP location one at a time.
This works fine when the number of input files is small, but when the count gets close to 500 it fails with an error. We have noticed that the maximum number of concurrent connections supported by the FTP server is 500.
It looks like when the child pipeline (the one that writes the file to the FTP location) completes, it does not release its FTP connection until the parent pipeline terminates.
Is this actually the behavior? Even though we are making sure we write the files sequentially, if each connection stays open until the parent finishes, the connections accumulate until the server's limit is reached and we get the error.
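To make the suspected behavior concrete, here is a toy Python model of the connection accounting (this is our own simplification, not the actual pipeline engine's API; `FtpServer`, `run_parent_pipeline`, and the 500-connection cap are illustrative assumptions):

```python
class FtpServer:
    """Toy stand-in for an FTP server with a concurrent-connection cap."""

    def __init__(self, max_connections=500):
        self.max_connections = max_connections
        self.open_connections = 0

    def connect(self):
        # Refuse new logins once the cap is reached, like a real server would.
        if self.open_connections >= self.max_connections:
            raise RuntimeError("530 Too many connections")
        self.open_connections += 1

    def disconnect(self):
        self.open_connections -= 1


def run_parent_pipeline(server, file_count, release_per_child):
    """Simulate the parent pipeline invoking the child once per file."""
    for _ in range(file_count):
        server.connect()          # child pipeline opens an FTP connection
        # ... child writes one file, sequentially ...
        if release_per_child:
            server.disconnect()   # connection closed when the child completes
    if not release_per_child:
        # Connections are only released when the parent pipeline terminates.
        while server.open_connections:
            server.disconnect()
```

With `release_per_child=False` (the behavior we suspect), writing 501 files fails with `530 Too many connections` even though the writes are sequential; with `release_per_child=True`, any number of files succeeds because at most one connection is ever open.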
Is there something we can do to ensure the FTP connection is closed?