Monday - last edited Monday
Hello,
I need to write files into S3 capped by file size (<20 MB), rather than using a fixed record-count batch limit with Pipeline Execute. My input is dynamic: sometimes 1k records produce a file >20 MB, and sometimes 10k records stay under 20 MB. We also have inputs with batches of >100k records. How can we write files based on size?
Thanks in advance.
Tuesday - last edited Tuesday
Hi @userg ,
Here's a sample pipeline that splits the source file into multiple 1 KB files.
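The same idea can be scaled up to your 20 MB target. As a minimal sketch of the size-based batching logic (assuming newline-delimited text records; the bucket and key names in the comment are hypothetical):

```python
# Size-based batching: instead of flushing every N records, flush whenever
# the serialized batch would exceed a byte limit, so each output file
# stays at or under that limit regardless of record count.

MAX_BATCH_BYTES = 20 * 1024 * 1024  # 20 MB target per output file

def batch_by_size(records, max_bytes=MAX_BATCH_BYTES):
    """Yield lists of records whose combined encoded size (including
    newline separators) stays at or below max_bytes. A single record
    larger than the limit is emitted alone rather than dropped."""
    batch, size = [], 0
    for rec in records:
        rec_size = len(rec.encode("utf-8")) + 1  # +1 for newline separator
        if batch and size + rec_size > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(rec)
        size += rec_size
    if batch:
        yield batch

# Each yielded batch would then be written as one S3 object, e.g. with
# boto3 (hypothetical bucket/key names):
#   s3.put_object(Bucket="my-bucket", Key=f"out/part-{i:05d}.txt",
#                 Body="\n".join(batch).encode("utf-8"))
```

Because batches are cut by serialized size rather than record count, the same pipeline handles both the 1k-record >20 MB case and the 100k-record case without tuning.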
Hope this helps.