How to best combine/map values from separate inputs?
I’m trying to produce output containing Value and Item ID from separate files.
I have two input files for this, with different naming for the fields. For example:
Transaction Name=TRANSACTION
First Name=FNAME
LastName=LNAME
In addition, one file contains the ID and the other contains the value. I need to combine these into an output with the Instance (text value) and the ID value. I have a Gate set up to create a doc with input0 and input1, as shown below.
Input0:
{ "Instance": "Transaction Name", "ID": "bae77" }
Input1:
{ "TRANSACTION": "AC00014623" }
What I want this to look like is:
{ "TextValue": "AC00014623", "Item": { "id": "bae77" } }
I have about 40 of these pairs, slightly different in each file.
Any recommendations or ideas?
@acesario If I'm understanding correctly, one of your files might have an array with more than one element, but the default parsing turns each element into a separate output document. You can change that by setting "Process array" to false on the JSON Parser, which produces a single output document whose data is the full array. Before you can feed that into the Join, you'll need a Mapper to map the array to a field of an object. I've attached a sample pipeline.
Community 8107_2020_08_27.slp (9.8 KB)
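In the sample, each Mapper simply nests the parsed data under a named field so the Join has something to combine. The mappings are roughly as follows (fileA and fileB are just the field names I chose; in the Mapper, the expression $ refers to the whole input document):
Expression: $   Target path: $fileA   (branch parsing fileA.json)
Expression: $   Target path: $fileB   (branch parsing fileB.json)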
Here are the inputs…
fileA.json:
[ { "file": "A" } ]
fileB.json:
[ { "file": "B", "id": 1 }, { "file": "B", "id": 2 } ]
Output of the Join:
[ { "fileA": { "file": "A" }, "fileB": [ { "file": "B", "id": 1 }, { "file": "B", "id": 2 } ] } ]
That combines all of the data from both input files into a single document. You can modify the Mappers to move things to the right places.
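From there, the last Mapper is just a matter of picking fields. For your original example, the mappings would be something along these lines (treat the source paths as placeholders, since they depend on how your combined document is actually laid out):
Expression: $input1.TRANSACTION   Target path: $TextValue
Expression: $input0.ID   Target path: $Item.id
which would produce:
{ "TextValue": "AC00014623", "Item": { "id": "bae77" } }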