
Changing the key names dynamically

sravan
New Contributor II

Hi,

I have a requirement to read 50 tables from Oracle and load the data into Postgres. There is no transformation logic in between, but while loading to the target the column names change slightly. For instance:

src_table1

{
  "id": 1,
  "name": "abc",
  "del_ind": true
}

While loading this data to the target, I need to change the column names: the JSON keys above have to be renamed because the target table's column names are different.

src_col     corresponding_tgt_col
id          id
name        full_name
del_ind     delete_indicator

I have 50 tables like this, and I can't create 50 Mappers to map the column names for each table. I need an expression to change the field names accordingly.

Right now I am maintaining a config file that contains all the information needed to parameterize everything.

The source table name, target table name, source schema, target schema, source account, and target account all come from my parent pipeline. Everything is set, but I am struggling to write an expression to change the column names.

Please help!

1 ACCEPTED SOLUTION

koryknick
Employee

@sravan - I'm sure others have their own ways to do things like this, but I'm partial to expression libraries, especially when you have multiple definitions to work with. Download the attached zip file and import the pipeline and file into your project.

In the file, you will notice that it is named based on the table name, in this case "test_table", and that the contents define an "xref" object mapping the source key names to the desired key names.
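As a rough sketch, the "test_table.expr" definition file could look something like this (the mappings here are illustrative, taken from the example tables above):

{
  xref: {
    "id": "id",
    "name": "full_name",
    "del_ind": "delete_indicator"
  }
}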

In the pipeline, the Mapper uses the Object.mapKeys() method to translate the key names, with an Object.get() call that checks whether the source key name is defined in the expression file and defaults to the source key name from the input view if it isn't.
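A minimal sketch of that Mapper expression, assuming the library is aliased as "tbldef" as described below:

$.mapKeys((value, key) => tbldef.xref.get(key, key))

Here $ is the incoming document, mapKeys() builds a new object with the keys returned by the callback, and get(key, key) returns the mapped name when it exists in xref and falls back to the original key when it doesn't.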

Also in the pipeline is a dynamic reference to the library expression file based on the table_name pipeline parameter value, aliased as "tbldef" so it can be referenced consistently in expressions regardless of the actual table being processed. This lets you keep one definition file per table, which is easier to maintain. Assuming you are using a parent pipeline that calls a child and passes the name of the table to be processed as a pipeline parameter, this should be easy to incorporate.
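For example, the Expression Libraries entry in the pipeline properties could be built from that parameter (the .expr extension is an assumption based on the description above):

_table_name + ".expr"

aliased as tbldef, so with table_name = "test_table" the pipeline loads test_table.expr, and the same tbldef.xref expression works unchanged for every table.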

Hope this helps!

sravan
New Contributor II

Thank you @koryknick, that works.