08-19-2020 08:41 PM
I'm trying to produce output containing a Value and an Item ID from separate files.
I have two input files for this, with different naming for the fields. For example:
Transaction Name=TRANSACTION
First Name=FNAME
Last Name=LNAME
In addition, one file contains the ID and the other contains the value. I need to combine these into an output with the Instance (text value) and the ID value. I have a gate set up to create a doc with input0 and input1, as shown below.
Input0:
"Instance": "Transaction Name"
"ID": "bae77"
Input1:
"TRANSACTION": "AC00014623"
What I want this to look like is:
{"TextValue": "AC00014623",
"Item": {"id": "bae77"}}
I have about 40 of these pairs, slightly different in each file.
Any recommendations or ideas?
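For illustration, the intended combination can be sketched in plain Python. The `name_map` dictionary is an assumption based on the name pairs listed above (one entry per pair, roughly 40 in total):

```python
# Sketch of the desired merge for one pair. input0 carries the
# human-readable field name and its ID; input1 carries the system
# field name and its value (examples taken from the post above).
input0 = {"Instance": "Transaction Name", "ID": "bae77"}
input1 = {"TRANSACTION": "AC00014623"}

# Assumed mapping from human-readable names to system names.
name_map = {"Transaction Name": "TRANSACTION"}

system_name = name_map[input0["Instance"]]
output = {
    "TextValue": input1[system_name],
    "Item": {"id": input0["ID"]},
}
print(output)
# {'TextValue': 'AC00014623', 'Item': {'id': 'bae77'}}
```

This is only a model of the data shapes involved, not a SnapLogic pipeline; the replies below discuss how to achieve the same result with Snaps.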
08-27-2020 02:27 PM
@acesario I think what I'm understanding is that one of your files might have an array with more than one element, but the default parsing turns each element into a separate output document. You can change that by setting "Process array" to false on the JSON Parser. That will produce a single output document whose data is the full array. Before you can feed that into the Join, you'll have to use a Mapper to map the array to the field of an object. I've attached a sample pipeline.
Community 8107_2020_08_27.slp (9.8 KB)
Here are the inputs…
fileA.json:
[
{
"file": "A"
}
]
fileB.json:
[
{
"file": "B",
"id": 1
},
{
"file": "B",
"id": 2
}
]
Output of the Join:
[
{
"fileA": {
"file": "A"
},
"fileB": [
{
"file": "B",
"id": 1
},
{
"file": "B",
"id": 2
}
]
}
]
That combines all of the data from both input files into a single document. You can modify the Mappers to move things to the right places.
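The parse-then-wrap step can be sketched in Python. This is only a model of what the pipeline does, not SnapLogic itself; the field names `fileA` and `fileB` come from the Join output shown above:

```python
import json

# With "Process array" set to false, each file is parsed as ONE
# document holding the entire array (rather than one doc per element).
file_a = json.loads('[{"file": "A"}]')
file_b = json.loads('[{"file": "B", "id": 1}, {"file": "B", "id": 2}]')

# The Mappers wrap each array in a named field before the Join; fileA
# has a single element, so it appears unwrapped in the Join output above.
doc = {"fileA": file_a[0], "fileB": file_b}
print(json.dumps([doc], indent=2))
```

The printed structure matches the Join output above: one document carrying all of the data from both files.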
08-26-2020 03:40 PM
@ptaylor Apologies that I only put the input files in the attachment. Here they are for your reference:
Input file 1:
[{"TRANSACTION": "AC00014623",
"FIRST_NAME": "Jorge",
"LAST_NAME": "Whoever"}]
Input file 2:
[{"TRANSACTION_WID": "bae771d54a9d018123ac687fea12800c",
"FIRST_NAME_WID": "bae771d54a9d01f0a3b4687fea12810c",
"LAST_NAME_WID": "bae771d54a9d011c59ba687fea12820c"
}]
The use case at this point is basically to APPEND the WID values to each input1 document. File 2 does not change. File 1 may have many rows.
Note: I was able to accomplish this by hard-coding the values to variables in a Mapper, but the reference IDs may change from time to time, and it is better practice to pull them from a file or a web service call. I hope this helps explain what I'm trying to accomplish.
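The append described here can be sketched in Python, assuming file 2 always holds exactly one record whose fields should be copied onto every record of file 1 (sample data taken from the files above):

```python
# File 1 may have many rows; file 2 holds a single record of WID values.
file1 = [
    {"TRANSACTION": "AC00014623", "FIRST_NAME": "Jorge", "LAST_NAME": "Whoever"},
]
file2 = [
    {"TRANSACTION_WID": "bae771d54a9d018123ac687fea12800c",
     "FIRST_NAME_WID": "bae771d54a9d01f0a3b4687fea12810c",
     "LAST_NAME_WID": "bae771d54a9d011c59ba687fea12820c"},
]

wids = file2[0]
# Append the same WID fields to every file 1 row, replacing the
# hard-coded Mapper variables mentioned above.
output = [dict(row, **wids) for row in file1]
```

Since the field names don't collide, each output row simply carries both the original fields and the appended WIDs.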
08-27-2020 11:09 AM
Sorry, I'm still not following. There are no values in common between the two files, so how would you expect a join to work?
08-27-2020 11:32 AM
@ptaylor Correct. There are currently no matched fields. In other tools, I have been able to append one file with another.
08-27-2020 12:05 PM
@acesario - From what you are describing, I think @ptaylor provided the correct solution with the Join snap using a 1=1 condition. Assuming that file 1 has an arbitrary number of records and file 2 has only one record, it would successfully append the fields in file 2 to each record in file 1. This matches what you did with the hardcoded values in the Mapper that you described earlier.