Remove duplicate values from the JSON array

Hello All,

I have a JSON array and I need to remove the duplicates based on a field, putting the duplicate elements in one array and the non-duplicate elements in another.

Input JSON:
[
{ "pName": "abc", "iNumber": 123 },
{ "pName": "def", "iNumber": 123 },
{ "pName": "xyz", "iNumber": 890 },
{ "pName": "jkl", "iNumber": 456 }
]

Required Output:
[
{ "pName": "abc", "iNumber": 123 },
{ "pName": "def", "iNumber": 123 }
]
[
{ "pName": "xyz", "iNumber": 890 },
{ "pName": "jkl", "iNumber": 456 }
]
Two separate JSON arrays: the first holds the elements whose iNumber is duplicated, and the second holds the elements with a unique iNumber.

I tried filter((item, pos, a) => a.findIndex(elem => item.iNumber == elem.iNumber) == pos), but it didn't give the required result: it still keeps one element of each duplicated group instead of moving them all to the duplicates array.
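To illustrate the problem, here is the same expression run in plain JavaScript on the sample data (the input array below is just the example from this post):

```javascript
const input = [
  { pName: "abc", iNumber: 123 },
  { pName: "def", iNumber: 123 },
  { pName: "xyz", iNumber: 890 },
  { pName: "jkl", iNumber: 456 }
];

// findIndex returns the position of the FIRST match, so the first element
// of each iNumber group always satisfies the pos check and survives the
// filter. This expression de-duplicates; it does not separate duplicates.
const deduped = input.filter(
  (item, pos, a) => a.findIndex(elem => item.iNumber === elem.iNumber) === pos
);
// deduped still contains { pName: "abc", iNumber: 123 },
// the first occurrence of the duplicated iNumber 123.
```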

Have you tried the Group by Fields Snap?

Try using the filter method

e.g.

{}.merge({"Unique": $array.filter((a,b,c)=> c.filter((x,y,z)=> a['iNumber'] == x['iNumber']).length == 1)}, {"Dups": $array.filter((a,b,c)=> c.filter((x,y,z)=> a['iNumber'] == x['iNumber']).length != 1)})
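Outside of a SnapLogic pipeline, the same partition logic can be sketched in plain JavaScript. The helper name `partitionByField` is made up for this example; an element counts as "unique" when its field value occurs exactly once in the whole array, matching the nested-filter expression above:

```javascript
// Split an array into unique and duplicate entries based on one field.
// Counting occurrences once up front avoids the O(n^2) nested filter.
function partitionByField(arr, field) {
  const counts = arr.reduce((m, item) => {
    m[item[field]] = (m[item[field]] || 0) + 1;
    return m;
  }, {});
  return {
    Unique: arr.filter(item => counts[item[field]] === 1),
    Dups: arr.filter(item => counts[item[field]] > 1)
  };
}

const input = [
  { pName: "abc", iNumber: 123 },
  { pName: "def", iNumber: 123 },
  { pName: "xyz", iNumber: 890 },
  { pName: "jkl", iNumber: 456 }
];

const result = partitionByField(input, "iNumber");
// result.Dups holds abc and def (iNumber 123 appears twice);
// result.Unique holds xyz and jkl.
```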


@alchemiz Thanks a lot. It worked.

Glad to be of help :smiley:
