I’m currently trying to build an integration where I need to take a large volume of files (photos) and zip them into separate zips, none larger than a certain size (the destination has a limit on how large the zip files can be).
I currently have a pipeline that reads the files, adds up their sizes, and generates a new ‘file name’ when the summed size hits a certain threshold. It works when I just write out to folders (using the file name as a folder name) with a Document to Binary + File Writer.
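To make the grouping logic concrete, here’s a rough sketch of what my pipeline is doing, written out in plain Python (the file names, the size limit of 100, and the "zipN" naming are just illustrative, not the actual pipeline values):

```python
# Sketch of the greedy size-based grouping: walk the files in order and
# start a new zip name whenever adding the next file would push the
# running total past the limit.

def assign_zip_names(file_sizes, max_zip_bytes):
    """Assign each file to a zip so no zip's total exceeds max_zip_bytes.

    file_sizes: list of (name, size) tuples, in pipeline order.
    Returns a list of (name, zip_name) tuples, e.g. ("a.jpg", "zip1").
    """
    assignments = []
    zip_index = 1
    running_total = 0
    for name, size in file_sizes:
        # Roll over to a new zip when this file would exceed the limit
        # (but never leave a zip empty, so an oversized single file
        # still gets a zip of its own).
        if running_total > 0 and running_total + size > max_zip_bytes:
            zip_index += 1
            running_total = 0
        running_total += size
        assignments.append((name, f"zip{zip_index}"))
    return assignments

files = [("a.jpg", 60), ("b.jpg", 50), ("c.jpg", 30), ("d.jpg", 80)]
print(assign_zip_names(files, 100))
# → [('a.jpg', 'zip1'), ('b.jpg', 'zip2'), ('c.jpg', 'zip2'), ('d.jpg', 'zip3')]
```

So each document comes out of that stage tagged with a zip name like zip1, zip2, zip3, and that name is what I’m feeding into the writer’s filename.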
The issue is that when I switch from a File Writer to a ZipFile Writer, all of the files get added to the first zip file, even though the zipfile name has the new zip file name in it. Has anyone dealt with this, and is there a solution? I’d rather not make a bunch of routers, as there’s no set number of output files (it all depends on how big the photos are).
The filename field on the zipfile writer I’m using is:
'file:///' + _directory_path + 'Output\' + $file_name + '.zip'
where $file_name is the variable zip file name (so zip1, zip2, zip3, …).