Create a reusable pipeline to dynamically apply default values

Here is the scenario:
I want to apply default values to a table, and the column(s) that need a default value might change over time. What I am doing is:

  1. Create a parameter CSV file with the same column names as the source table.
  2. Join the source table with the CSV file on 1=1; the CSV columns will then get the prefix `input1_`.
  3. Using `hasOwnProperty`, the default value from the CSV file can be applied to the source table. E.g. `$.hasOwnProperty('input1_COUNTRYCODE') ? $input1_COUNTRYCODE : $COUNTRYCODE` applies the default value for the [COUNTRYCODE] column.
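In plain JavaScript, the per-column fallback in step 3 works roughly like this (the record and its values are illustrative, not taken from the actual pipeline):

```javascript
// One joined record: source columns plus the default-value
// columns from the CSV parameter file, prefixed "input1_".
const joined = {
  BANKNAME: "Test Bank 001",
  COUNTRYCODE: null,
  input1_COUNTRYCODE: "US", // default from the CSV parameter file
};

// Step 3 as a per-column ternary: if the CSV supplied a default
// for this column, use it; otherwise keep the source value.
const COUNTRYCODE = Object.prototype.hasOwnProperty.call(joined, "input1_COUNTRYCODE")
  ? joined.input1_COUNTRYCODE
  : joined.COUNTRYCODE;
```

This is exactly the expression that has to be repeated once per column, which is what motivates the question below.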

My question is: if there are 100 columns in the source table, I need to write 100 `hasOwnProperty` expressions in the Mapper, and if the column names in the source table change I have to modify all of those expressions. So, is there any expression that can get the field names dynamically, to replace the hard-coded COUNTRYCODE in the expression?

Source table:

    "BANKNAME": "Test Bank 001",
    "BANKNUMBER": "Test123",
    "BANKNAME": "Test Bank 002",
    "BANKNUMBER": "Test456",

Parameter file (default value):


Hey @Dai9210,

Here’s one solution for you:

Default Values_2021_09_27.slp (8.6 KB)

The pipeline sends all records in an array. That array is joined with the default values on 1=1. Right after, a Mapper follows with an expression that maps over the `$group` array; within that map, a nested `.mapValues()` call checks whether each key also exists in the incoming document (apart from `$group`). If it does, the value is replaced with the default; if not, the existing value is kept. After the mapping is done, a JSON Splitter splits `$group` back into individual records.
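In plain JavaScript, the dynamic rewrite above can be sketched like this. SnapLogic's `.mapValues()` is approximated with `Object.entries`/`Object.fromEntries`, and the document contents are illustrative:

```javascript
// Joined document: the grouped source records plus the
// default-value columns joined in from the parameter file.
const doc = {
  group: [
    { BANKNAME: "Test Bank 001", BANKNUMBER: "Test123", COUNTRYCODE: null },
    { BANKNAME: "Test Bank 002", BANKNUMBER: "Test456", COUNTRYCODE: null },
  ],
  COUNTRYCODE: "US", // default from the parameter file
};

// For every record in the group, replace each value whose key also
// exists at the top level of the document (apart from "group") with
// the default; keep the existing value otherwise.
const applied = doc.group.map(record =>
  Object.fromEntries(
    Object.entries(record).map(([key, value]) =>
      key !== "group" && Object.prototype.hasOwnProperty.call(doc, key)
        ? [key, doc[key]]
        : [key, value]
    )
  )
);
```

Because the keys are discovered at runtime, adding or renaming columns in the source table requires no change to the expression, only to the parameter file.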



It works!
Thanks very much! @bojanvelevski
