We want to use the SQL Server Bulk Load Snap to load large files (50+ million rows). It appears that the temp file used by bcp is held in memory, because loading consumes all of the memory on the host. We are looking for alternative ways to load these large files.

We tried using a Group By N Snap to group the records into batches and pass each batch to a child pipeline via a Pipeline Execute Snap, where the child pipeline runs the SQL Server Bulk Load Snap. However, the maximum batch size on the Group By N Snap is 10,000, which results in many instances of the child pipeline, and this also consumed all of the memory on the host.

Has anyone successfully solved this problem?
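For context on the batching idea described above, here is a minimal sketch (outside SnapLogic, e.g. from a Script Snap or an external job) of splitting the source file into fixed-size chunk files on disk and invoking `bcp` once per chunk, so no single load ever holds more than one chunk's worth of data. The function names are hypothetical, and the `bcp` flags shown (`-S` server, `-T` trusted connection, `-c` character mode, `-t,` field terminator) assume a comma-delimited file and Windows authentication; adjust for your environment.

```python
import csv
import os
import subprocess


def split_into_chunks(src_path, chunk_dir, rows_per_chunk=1_000_000):
    """Split a large CSV into fixed-size chunk files so each bulk load
    only touches a bounded amount of data. Returns the chunk file paths."""
    chunk_paths = []
    out, writer, count = None, None, 0
    with open(src_path, newline="") as src:
        for row in csv.reader(src):
            if count % rows_per_chunk == 0:
                if out:
                    out.close()
                path = os.path.join(
                    chunk_dir, "chunk_%05d.csv" % len(chunk_paths)
                )
                chunk_paths.append(path)
                out = open(path, "w", newline="")
                writer = csv.writer(out)
            writer.writerow(row)
            count += 1
    if out:
        out.close()
    return chunk_paths


def load_chunk_with_bcp(chunk_path, table, server):
    """Illustrative per-chunk load via the bcp CLI ('in' direction).
    One bounded bcp run per chunk instead of one giant load."""
    subprocess.run(
        ["bcp", table, "in", chunk_path, "-S", server, "-T", "-c", "-t,"],
        check=True,
    )
```

This keeps memory flat because the chunking pass streams the source file row by row, and each `bcp` run sees only one chunk file; the trade-off is extra disk I/O for the intermediate chunk files.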