Creating output file names dynamically for each SQL Server table data stream taken as input from a configuration file

rakesh_2004
New Contributor

I am trying to create a pipeline that takes a configuration file as input; the file contains a table name and the column names I need for that table. Any number of tables can be listed, and the number of columns selected can differ for each table. I want to write the data out to files, with a separate output file created per table using the naming pattern out_<table name>.csv. I used a SQL Execute snap to get the data and pass it to the target, but I am not able to create separate files: all the data gets appended into one file, and the header row appears only for the first table. Any suggestions on how this can be done? Please feel free to ask questions about the requirement. This will let me do one-time loads from a database to a file system while controlling the table names and column names on the fly.
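
To make the goal concrete, here is roughly the equivalent logic sketched in plain Python. The configuration file name, its layout, and the connection string below are placeholders for illustration only; the real work would happen inside the SnapLogic pipeline.

```python
import csv
import json

import pyodbc  # SQL Server ODBC driver; any DB-API connection would do

# Hypothetical config layout:
# [{"table": "Customers", "columns": ["Id", "Name"]},
#  {"table": "Orders",    "columns": ["OrderId", "CustomerId", "Total"]}]
with open("tables_config.json") as f:
    config = json.load(f)

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"  # placeholder
)

for entry in config:
    table = entry["table"]
    columns = entry["columns"]

    cursor = conn.cursor()
    cursor.execute(f"SELECT {', '.join(columns)} FROM {table}")

    # One output file per table, named after the table,
    # each with its own header row (assumed pattern out_<table>.csv).
    with open(f"out_{table}.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(columns)   # header row per file
        for row in cursor:
            writer.writerow(row)
    cursor.close()

conn.close()
```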

8 REPLIES

christwr
Contributor III

Once you "go binary" (triangle shape output) in the CSV Formatter, there is no data retained other than the binary data itself. So you don't have the TableName anymore. You can still reference values that were passed into the pipeline as parameters, though (_TableName). So if you call a child pipeline, passing the table name as a pipeline parameter, then the child pipeline can use that parameter in the downstream File Writer.

OK. Is there any way to assign a parameter in the same pipeline and access it from the File Writer snap?

christwr
Contributor III

You can't change a pipeline parameter within the pipeline. You might be able to pass a value through as a binary header in the CSV Formatter and then reference it downstream, but that might also make your file a little wonky, not sure. Calling child pipelines is pretty straightforward though, as this is a typical reason to do so.
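
In plain-code terms (Python here purely as an analogy, with hypothetical names), the split looks like this: the "parent" reads the configuration and calls the "child" once per table, and the child uses the table name it was handed to build the output file name, much like a child pipeline would use the _TableName pipeline parameter in its File Writer.

```python
import csv


def export_table(cursor, table_name, columns):
    # "Child pipeline" analogue: the table name arrives as a parameter
    # and is used downstream to build the output file name and header.
    cursor.execute(f"SELECT {', '.join(columns)} FROM {table_name}")
    with open(f"out_{table_name}.csv", "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(columns)             # header row for this table's file
        writer.writerows(cursor.fetchall())


def export_all(cursor, table_config):
    # "Parent pipeline" analogue: loop over the configuration entries and
    # invoke the child once per table, passing the table name along.
    for entry in table_config:
        export_table(cursor, entry["table"], entry["columns"])
```

Each call to export_table produces its own file with its own header, which is exactly what gets lost when everything stays in a single pipeline stream.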

Thanks for the detailed inputs. Certainly appreciate that, everyone.