Error pipelines and pipeline execute

I am currently redesigning our error handling functionality. The basic design is to write all errors to a SQL Server database so we can have a dashboard of pipeline issues from the last 24 hours. As a proof of concept I set this up on one of our more complex pipelines, which makes multiple calls to several different child pipelines in order to perform some processes in parallel. I know the data set I'm working with and know it has exactly 5 errors in it, so when I added the error pipeline to everything I expected to see 5 rows; instead I had 40. I'm guessing this happened because I put the error pipeline on both the parent pipeline and the child pipelines.

The obvious solution would be to have the error pipeline only on the parent pipeline and let the child pipelines ignore errors, but I was hoping the community could tell me what best practice truly is here, since I can't find any documentation on how to configure error pipelines when you are working with Pipeline Execute snaps.

A secondary question: if I only have the error pipeline on the parent, is there a good way to pass the name of the snap that actually caused the error back to the parent so it can be sent to the error pipeline as well? Currently the error documents loaded into my table only carry the name of the Pipeline Execute snap, but I'd really like to get the name of the snap in the child pipeline that caused the error, since that would make debugging much simpler.
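One hedged approach to the secondary question: rather than having the child pipelines suppress errors, route each child snap's error view through a Mapper that annotates the error document with its origin before it is handled, so whatever ends up in the dashboard table already names the failing snap. In the sketch below, pipe.label is a built-in SnapLogic pipeline property; $error and $reason are typical error-view fields that vary by snap; and snap_label is a hypothetical field whose value you would hard-code per error view (each error view is wired to a known snap, so the label is known at design time):

    Expression                 Target path
    $error                     error
    $reason                    reason
    pipe.label                 child_pipeline
    'Lookup Person Types'      snap_label      (hypothetical; set per error view)
    Date.now()                 logged_at

With the children annotating their own errors and a single level doing the SQL Server write, each failure should be recorded exactly once instead of once per pipeline that sees it, which is one plausible explanation for 40 rows where 5 were expected.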
Re: Expression help (Solved)

Since I didn't include it before, here is an example of an expression that works, but with the key I need hard-coded. I just need a way to get that key dynamically from my valcode table instead of hard-coding it:

    jsonPath($, "persons.old[*].names[*].type.detail.id")
        .indexOf(jsonPath($, "valcodes['person-name-types'][*]")
            .filter((value) => value.code == 'LEGAL')[0].id) != -1
      ? jsonPath($, "persons.new.names[0].firstName") !=
        jsonPath($, "persons.old[*].names[?(value.type.detail.id == '55110e25-2ec5-421f-82d5-cc98451a019e')].firstName").toString()
      : false

Expression help

I am working on a new pipeline for syncing some data. As part of that sync I want to compare my new data against the data in the target system so I don't update any data that is unchanged. To do this I have written an expression that looks up some types in the system to ensure I'm only comparing things of the same type (e.g., legal names with legal names):

    jsonPath($, "persons.old[*].names[*].type.detail.id")
        .indexOf(jsonPath($, "valcodes['person-name-types'][*]")
            .filter((value) => value.code == 'LEGAL')[0].id) != -1
      ? jsonPath($, "persons.new.names[0].firstName") !=
        jsonPath($, "persons.old[*].names[jsonPath($, "persons.old[*].names[*].type.detail.id").indexOf(jsonPath($, "valcodes['person-name-types'][*]").filter((value)=>value.code=='LEGAL')[0].id)].firstName")
      : false

This is supposed to get the index of the proper name in the document and then, if the type exists in the list, check whether the old name of that type differs from the new name of that type.

Where I'm struggling is with the index check. The expression

    jsonPath($, "persons.old[*].names[*].type.detail.id")
        .indexOf(jsonPath($, "valcodes['person-name-types'][*]")
            .filter((value) => value.code == 'LEGAL')[0].id)

run on its own returns an index for me, generally 0 or 1, which I was hoping to use to select the proper name item in the list. But when I substitute that lookup for the * in names[*], I get the error:

    Expression parsing failed near -- nPath($, " >> persons << .colleagu
    (Reason: Mismatched input 'jsonPath' at line 1:273. Expecting one of:
    {OctNumber, HexNumber, Float, Digit, QuotedString, NonTerminatedString,
    Regex, 'match', '[', '{', '(', 'true', 'false', 'function', 'null',
    'NaN', 'Infinity', Id, '@'}; Resolution: Please check expression syntax)

I assume this is due to using a jsonPath inside a jsonPath, but I'm not entirely sure what to do about it. I'll also note that the sequence does work if I hard-code the ID I'm searching for with a ? filter inside the jsonPath statement, but I really want to avoid that, since the ID will vary between dev, test, and production.

Re: Formatting variable JSON

Thanks for the help! That didn't get me all the way there, but after looking at it I realized I could use another map to get the values I needed. In the end I did this:

    $Key.split(',').map(x => {"id": x}).map(x => {"key": x})

which got me the exact output I was looking for.

Formatting variable JSON

I am working on an integration between two endpoints that deal with values in very different ways, and I'm struggling to convert between them. The source I'm getting data from sends multiple values on the same element, so my input document looks like this:

    [
      { "Key": "V1,V2,V3" },
      { "Key": "V1" },
      { "Key": "V1" },
      { "Key": "V1,V2,V3,V4" }
    ]

but my output document needs to look like this:

    [
      { "keys": [ { "key": { "id": "V1" } }, { "key": { "id": "V2" } }, { "key": { "id": "V3" } } ] },
      { "keys": [ { "key": { "id": "V1" } } ] },
      { "keys": [ { "key": { "id": "V1" } } ] },
      { "keys": [ { "key": { "id": "V1" } }, { "key": { "id": "V2" } }, { "key": { "id": "V3" } }, { "key": { "id": "V4" } } ] }
    ]

Does anyone have suggestions for a good way to handle this conversion? Given the variable length of each document, I'm thinking I might need a child pipeline so I can build the output JSON element by element, but I'm not sure that's the best approach.
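On the Expression help question above, one hedged way around the parse error: the inner jsonPath() call sits inside a double-quoted path string, so its own double quotes terminate that string early, which is exactly what the parser is complaining about. Pulling the names out as an array first and applying the dynamic ID with a JavaScript-style filter() callback keeps the valcode lookup as an ordinary expression instead of text embedded in a path. A sketch under the same assumption the original makes, namely that a LEGAL entry exists in valcodes:

    jsonPath($, "persons.old[*].names[*]")
        .filter(n => n.type.detail.id == jsonPath($, "valcodes['person-name-types'][*]")
            .filter(v => v.code == 'LEGAL')[0].id).length > 0
      ? jsonPath($, "persons.new.names[0].firstName") !=
        jsonPath($, "persons.old[*].names[*]")
            .filter(n => n.type.detail.id == jsonPath($, "valcodes['person-name-types'][*]")
                .filter(v => v.code == 'LEGAL')[0].id)[0].firstName
      : false

Repeating the valcode lookup twice is ugly; if it is needed in several places, computing it once in an upstream Mapper field (say $legalTypeId, a hypothetical name) and referencing that field keeps the comparison readable.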
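For the Formatting variable JSON question, the accepted two-map answer above can also be collapsed into a single pass that builds the nested objects directly; a minimal sketch, assuming a Mapper with keys as the target path:

    $Key.split(',').map(id => { "key": { "id": id } })

An input document of { "Key": "V1,V2" } then maps to { "keys": [ { "key": { "id": "V1" } }, { "key": { "id": "V2" } } ] }, with no child pipeline needed.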
GraphQL pagination (Solved)

Does anyone else have experience with pagination against GraphQL APIs? I am working with a new endpoint that uses GraphQL, but its limit and offset parameters live in the body of the request, which creates issues. I've figured out that I can use expressions to set the limit and offset in the body, but even with the pagination condition set to a generic "entity isn't empty" expression, the first request goes through fine while the second one returns an error stating the request has no body. So in the end I have two questions:

1. Is there a way to ensure the pagination request resends the body when sent through the HTTP Client snap?
2. Is there any better way to deal with GraphQL API requests than writing a raw body in the HTTP Client snap?

If it helps, here is a screenshot of my current HTTP Client snap setup.

Re: How does a Diff snap determine modified vs inserted?

Here's a picture of my pipeline. I'm using a CSV file from a 3rd-party system as the new file, and a select statement from my database as the original document. I run them both through a Mapper to ensure that the column headings match and the sort is on the unique ID. I've also run the pipeline normally and in validation mode. For some reason, 2 records keep coming through as insertions even though their unique IDs exist in both files. All the other records flow correctly to either deleted or unmodified (there were other modified records from before, but they all properly flowed to the update output when I ran the pipeline normally earlier, which can be seen here). So something about these 2 records makes the Diff snap think they are new when they really are not.

How does a Diff snap determine modified vs inserted?

I'm currently using a Diff snap as part of a pipeline that inserts data into a SQL server for reporting purposes. I'm using the Diff snap to determine whether a record needs to be updated or inserted. The problem I'm having is that 2 of the records I would expect to come through as modified are showing up as new instead, which causes a primary key error when I hit the insert snap. I thought the field I set as the sort path acted as a key for determining whether a record was new, but now I'm not sure that's the case. Could anyone explain what might be going on and how I might resolve it? Ideally I'd like the Diff snap to look only at the ID column when deciding whether a record is new.
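On the GraphQL pagination question above, a hedged sketch of the raw-body pattern: GraphQL endpoints generally accept a POST body of the form { query, variables }, so the paging values can live in variables while the query string stays static. The items field and the $limit/$offset variable names below are illustrative assumptions about the API, not anything confirmed by the post:

    {
      "query": "query GetItems($limit: Int!, $offset: Int!) { items(limit: $limit, offset: $offset) { id name } }",
      "variables": { "limit": 100, "offset": 0 }
    }

If the HTTP Client snap will not re-evaluate and resend the body on its pagination calls, one common workaround is to move the request into a small child pipeline whose body reads pipeline parameters (e.g., "offset": parseInt(_offset), where _offset is a hypothetical parameter name) and have the parent drive it through Pipeline Execute, incrementing _offset until a page comes back empty.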
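On the Diff question: my understanding is that the Diff snap aligns the two sorted streams by the sort-path value and only then compares documents, and that key matching is strict, so a pair can fail to line up on differences that are invisible at a glance, e.g. an ID arriving as the number 1 from SQL Server but the string "1" from the CSV, trailing whitespace, or the two streams being ordered under different collations (SQL Server's ORDER BY is often case-insensitive, while an in-pipeline sort compares raw values). A hedged normalization sketch to put in a Mapper ahead of both inputs, followed by identical Sort snaps inside the pipeline rather than relying on the source's ordering; $id, $name, and $amount are hypothetical column names:

    Expression                             Target path
    $id.toString().trim()                  id        (the sort/match key)
    $name == null ? "" : $name.trim()      name
    parseFloat($amount)                    amount

If the two stubborn records still come out as insertions after normalizing, joining the raw pair on the ID and comparing them field by field usually exposes exactly which column, or which representation of the key, differs.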