
Generic Pipeline

Swatisree
New Contributor

How do I read multiple files with different structures and use a generic pipeline to load them into tables based on a parameter?
I can use the parameter to load them into tables based on the respective file names, but I am facing a challenge in reading different structures with the same File Reader.
Can anyone suggest an option?

5 REPLIES

cstewart
Former Employee

The best approach here would be to use a child pipeline that executes on each file to be read. That way each pipeline only ever deals with a single structure.
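
A minimal sketch of that parent/child layout, assuming a Pipeline Execute Snap in the parent (the pipeline names, parameter names, and field references below are illustrative, not from this thread; pipeline parameters are referenced with a leading underscore):

    Parent pipeline:
      Directory Browser  ->  Pipeline Execute
          Pipeline:             "load_" + _tableName     (expression enabled)
          Pipeline parameters:  fileName = $Path

    Child pipeline (one per file structure):
      File Reader (File: _fileName)  ->  CSV/JSON Parser  ->  Mapper  ->  Oracle - Insert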
Where is this parameter coming from? Can you describe the scenario a little more?

Swatisree
New Contributor

Thank you, Craig, for your response!

The parameter is a pipeline parameter that passes the table name, and I will name the files accordingly so that they can be identified with a single parameter.
But the metadata for the different files would be different; can I use schema files in that case? If yes, how are they mapped in a File Reader or any other Snap? Or is that not feasible in SnapLogic?

In SnapLogic, the File Reader just reads the binary stream; once the stream has started, it is passed to the next Snap, usually a parser. Another option would be a File Reader followed by a Binary Router, in which you use an expression (based on a pipeline parameter, the file name, or something similar) to route the input to the appropriate parser (and the downstream Snaps) for loading into the various tables. How many different formats are you potentially routing to?
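
For example, assuming the files arrive through one File Reader, the Binary Router's route expressions can key off the binary document's header or a pipeline parameter (the header field, table names, and parameter names below are illustrative assumptions):

    Binary Router (one output view per format):
      Route 1:  $['content-location'].endsWith('.csv')   ->  CSV Parser   ->  ...
      Route 2:  $['content-location'].endsWith('.json')  ->  JSON Parser  ->  ...
      Route 3:  _tableName == 'ORDERS'                    ->  XML Parser   ->  ...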

Hi Swatisree,

We have implemented this solution for one of our clients, and we process millions of records each day. Here is the solution we designed:

  1. A generic pipeline that contains only one Oracle Insert/Upsert Snap, exposed as a triggered task. Parameterize the table name, column names, etc.
  2. One pipeline that reads the file dynamically (based on the parameter), uses a Binary Router to identify the table name, applies the appropriate parser, and then a Mapper if you want to transform anything. Make sure the target field names are the same as your table's column names. Then call the triggered task exposed in Step 1 and pass the parameters (table name, schema name, etc.); see the invocation sketch after this list.
  3. Pass the table name and file name to the pipeline in Step 2.
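
As a reference for how Step 2 calls into Step 1: a triggered task is invoked over HTTPS, and pipeline parameters can be passed as query-string parameters. A minimal sketch (the org/project path, parameter names, and token are placeholders, not values from this thread):

    POST https://elastic.snaplogic.com/api/1/rest/slsched/feed/<org>/<project>/<task_name>?tableName=CUSTOMERS&schemaName=SALES
    Authorization: Bearer <task_bearer_token>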

Hope it helps.