Hi Team,
I am trying to achieve the following and need directions on how to do it in a Mapper or any other way.
Source:
{
  "UBER_ID": [
    "1",
    "Integer_pattern"
  ],
  "First_name": [
    "Majid",
    "TextOnly_pattern"
  ],
  "Last_name": [
    "",
    "TextOnly_pattern"
  ]...
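For illustration, here is a minimal Python sketch of the kind of pattern-driven validation described above, assuming each field arrives as a [value, pattern_name] pair and that the pattern names resolve to regular expressions. The regexes below (and running this outside the pipeline) are my own assumptions, not SnapLogic functionality:

```python
import re

# Assumed mapping from pattern names to regexes; adjust to the real definitions.
PATTERNS = {
    "Integer_pattern": r"\d+",
    "TextOnly_pattern": r"[A-Za-z]*",
}

def validate_record(record):
    """Return {field: True/False} for each [value, pattern_name] pair."""
    results = {}
    for field, (value, pattern_name) in record.items():
        regex = PATTERNS.get(pattern_name)
        results[field] = bool(regex is not None and re.fullmatch(regex, value))
    return results

source = {
    "UBER_ID": ["1", "Integer_pattern"],
    "First_name": ["Majid", "TextOnly_pattern"],
    "Last_name": ["", "TextOnly_pattern"],
}
print(validate_record(source))
# {'UBER_ID': True, 'First_name': True, 'Last_name': True}
```

Because the pattern lookup is data-driven, adding a new regex only means adding an entry to the mapping, which matches the "generic pipeline" goal discussed in the replies below.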
Hi Team,
I would like some recommendations/directions regarding a possible solution to the requirement below.
I have created a generic pipeline to read 800 different Mainframe VSAM binary files with different formats and parse them using associated copyboo...
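The generic, layout-driven idea can be illustrated (outside SnapLogic) with a small parser where each copybook is reduced to data, so one routine handles every file format. This is only a simplified sketch with invented layouts and fixed-length text fields; real VSAM/copybook parsing (COMP-3, REDEFINES, occurs clauses, etc.) needs a proper copybook parser:

```python
# Simplified sketch: each "copybook" becomes a list of (field, length) entries,
# and records are fixed-length byte slices. Layout names and fields are invented.
LAYOUTS = {
    "CUSTOMER": [("CUST_ID", 6), ("CUST_NAME", 20)],
    "ACCOUNT":  [("ACCT_ID", 8), ("BALANCE", 10)],
}

def parse_record(raw: bytes, layout_name: str, encoding: str = "cp037") -> dict:
    """Slice one fixed-length record according to the named layout."""
    fields, offset = {}, 0
    for name, length in LAYOUTS[layout_name]:
        fields[name] = raw[offset:offset + length].decode(encoding).strip()
        offset += length
    return fields

# One generic routine serves every file; only the layout name changes per file.
sample = "000123John Smith          ".encode("cp037")
print(parse_record(sample, "CUSTOMER"))
# {'CUST_ID': '000123', 'CUST_NAME': 'John Smith'}
```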
Hi All,
I am looking for directions on how to perform a join based on a BETWEEN clause. I would like to join these two sources where Source1.Key is between Source2.Key Start and Source2.Key End.
Source 1:

|Key|Name|
|---|----|
|0|ABC0|
|1|ABC1|
|2|ABC2|
|3|ABC3|
|4...
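One way to express this BETWEEN-style join, shown here in plain Python rather than in a specific snap, is a pairwise comparison that emits a joined row whenever Source1.Key falls inside a Source 2 range. The Source 2 column names (Key_Start, Key_End) and its sample rows are assumptions, since that data is truncated above:

```python
source1 = [
    {"Key": 0, "Name": "ABC0"},
    {"Key": 1, "Name": "ABC1"},
    {"Key": 2, "Name": "ABC2"},
    {"Key": 3, "Name": "ABC3"},
]
# Assumed shape of Source 2: each row carries an inclusive key range.
source2 = [
    {"Key_Start": 0, "Key_End": 1, "Group": "Low"},
    {"Key_Start": 2, "Key_End": 3, "Group": "High"},
]

# Emit one joined row for every Source1.Key that lies between
# Source2.Key_Start and Source2.Key_End (inclusive, like SQL BETWEEN).
joined = [
    {**left, **right}
    for left in source1
    for right in source2
    if right["Key_Start"] <= left["Key"] <= right["Key_End"]
]
for row in joined:
    print(row)
```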
Hi Team,
We have a requirement to cache credentials for a specific time so that different pipelines executing within that time window do not request new credentials from AWS each time we execute a pipeline.
Currently at the beginning of e...
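As a sketch of the caching idea itself: keep the credentials alongside a fetch timestamp in a location shared across pipeline executions, and only call AWS when the cached copy is older than the window. The file path, TTL, and fetch_credentials placeholder below are all assumptions; the real fetch would be whatever STS/assume-role call the pipelines make today:

```python
import json
import time
from pathlib import Path

CACHE_FILE = Path("/tmp/aws_creds_cache.json")  # assumed shared location
TTL_SECONDS = 45 * 60                           # assumed validity window

def fetch_credentials() -> dict:
    """Placeholder for the real call that requests credentials from AWS."""
    raise NotImplementedError("replace with the existing STS/assume-role call")

def get_credentials() -> dict:
    """Reuse cached credentials until they are older than TTL_SECONDS."""
    if CACHE_FILE.exists():
        cached = json.loads(CACHE_FILE.read_text())
        if time.time() - cached["fetched_at"] < TTL_SECONDS:
            return cached["credentials"]
    creds = fetch_credentials()
    CACHE_FILE.write_text(json.dumps({"fetched_at": time.time(),
                                      "credentials": creds}))
    return creds
```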
Hello,
I have a requirement to read incremental data from a source database, apply the required transformations, and create target files in PDF format. Has anyone implemented a pipeline that generates a PDF file as the target?
Thank You,
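One possible direction (not a built-in snap) is to produce the PDF in a small script step using a library such as ReportLab. The sketch below simply writes one text line per incoming record and assumes the records are already-transformed dicts; the layout and field names are illustrative only:

```python
from reportlab.lib.pagesizes import letter
from reportlab.pdfgen import canvas

def write_pdf(records, path="target.pdf"):
    """Write one text line per record; purely illustrative layout."""
    pdf = canvas.Canvas(path, pagesize=letter)
    y = 750
    for rec in records:
        pdf.drawString(40, y, ", ".join(f"{k}={v}" for k, v in rec.items()))
        y -= 15
        if y < 40:          # start a new page when the current one is full
            pdf.showPage()
            y = 750
    pdf.save()

write_pdf([{"id": 1, "name": "ABC1"}, {"id": 2, "name": "ABC2"}])
```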
@del Thank you so much for all your help. I think this is a better solution, and I will not need the Pivot or even the Data Validator if I go with it. This will also make the pipeline more dynamic. Any new regex I need to add, I will add in expressio...
@del Thank you so much… this is exactly what I was looking for. I will verify the performance and update you…
Can you provide an example, if possible, of how it can be done with an expression library, as the Data Validator does not allow any expression li...
Thank you @del … This works, but in my case the number and names of fields will differ from one file to another, as I am creating a generic pipeline. Is there any way to make the number of fields and the field names in the Pivot snap dynamic?
I am also not sure...
@cjhoward18 Thank you. This renames the arrays, but since there is more than one field in the object with the same key name, the output only keeps the last instance.
@bojanvelevski Thank you. My requirement is to validate each field against differen...