06-24-2024 10:17 AM - edited 06-24-2024 10:39 AM
Hello,
I need to write files into S3 based on a file size limit (<20 MB), rather than setting a record-count batch limit with Pipeline Execute. My input is dynamic: sometimes 1k records produce a file >20 MB, while other times 10k records stay under 20 MB. We also have batches of >100k records as input. So how can we write files based on size?
Thanks in advance.
06-25-2024 02:04 AM - edited 06-25-2024 02:06 AM
Hi @userg ,
Here's a sample pipeline that splits the source file into multiple 1 KB files. The idea is the same for your 20 MB limit: accumulate records until the serialized output approaches the size threshold, then flush that chunk as its own file and start a new one.
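If it helps to see the size-based batching logic spelled out outside of the pipeline, here is a minimal Python/boto3 sketch of the same idea. The bucket name, key prefix, and JSON-lines serialization are my own assumptions for illustration, not part of the sample pipeline above.

```python
# Sketch: accumulate records until the buffer nears a byte limit, then flush each chunk to S3.
import io
import json

import boto3

MAX_BYTES = 20 * 1024 * 1024   # target upper bound per file (~20 MB)
BUCKET = "my-bucket"           # assumed bucket name
PREFIX = "output/batch"        # assumed key prefix

s3 = boto3.client("s3")


def write_size_based_batches(records):
    """Write records to S3 as JSON-lines objects, each kept under MAX_BYTES."""
    buffer = io.BytesIO()
    part = 1
    for record in records:
        line = (json.dumps(record) + "\n").encode("utf-8")
        # Flush before this record would push the buffer past the size limit.
        if buffer.tell() and buffer.tell() + len(line) > MAX_BYTES:
            buffer.seek(0)
            s3.put_object(Bucket=BUCKET, Key=f"{PREFIX}-{part:05d}.jsonl", Body=buffer.read())
            buffer = io.BytesIO()
            part += 1
        buffer.write(line)
    # Flush the final partial chunk, if any.
    if buffer.tell():
        buffer.seek(0)
        s3.put_object(Bucket=BUCKET, Key=f"{PREFIX}-{part:05d}.jsonl", Body=buffer.read())
```

Because the check is done on serialized bytes rather than record count, the number of records per file adapts automatically whether the batch is 1k, 10k, or 100k+ records.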
Hope this helps.