Forum Discussion

LeeDuffy
New Contributor II
9 months ago
Solved

Simple JSON nesting issue

Hi All,

I would like some advice on, or a better solution for, my query!

I am preparing some data to send in a table via email; the data structure is as below:


[
  {
    "statusCode": 202,
    "statusText": "Accepted",
    "payload_file_name": "s3:///bucket/api-payload/archive/api_payload_20241023_140821324_j1.json",
    "receivedDate": "2024-10-28T13:25:26.6271537Z",
    "processedDate": "2024-10-28T13:25:26.6271537Z",
    "rowsCount": 2,
    "transactionId": "{a0aa71d8-5c0a-4bdc-a39f-a42bf5143c3b}"
  }
]

I am preparing the fields in a parent pipeline and feeding the data as a string to a child pipeline. A Mapper creates this using data.field_name formatting, and "data" is passed as the pipeline parameter.

When the child receives this, I am having trouble getting the tweaked code below (from another post by koryknick) to map the fields from horizontal to vertical format (easier to read in a quick email notification):

$email_data.entries().slice(0).map(x => { "Field": x[0], "Value": x[1] })

This code works fine directly in line with the Mapper snap that creates the data.field_name data. But when the data is passed to a child, it arrives wrapped in a set of square brackets [...], and the code no longer works as it did. I have tried many forms of grouping and mapping, and have come up with what I think is a hacky solution that something better could replace.

My solution is simply to replace the [ and ] characters with nothing (.replace('[','')) in the incoming string to the child pipeline. I'm convinced there is a better way to do this with the expression language, but I can't figure it out!
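Since pipeline parameters arrive as strings, a cleaner alternative to stripping the brackets is to parse the string back into JSON and index the first element of the array. Here is a minimal sketch in plain JavaScript (variable names are illustrative; in SnapLogic's expression language the same idea would look something like `JSON.parse(_email_data)[0].entries().map(...)`, but treat that as an assumption, not the exact snap expression):

```javascript
// Illustration in plain JavaScript, not SnapLogic expression syntax.
// A shortened version of the payload string the child pipeline receives:
const emailData = '[{"statusCode": 202, "statusText": "Accepted", "rowsCount": 2}]';

// Parse the string instead of stripping the brackets, then take the
// first (and only) object in the array.
const record = JSON.parse(emailData)[0];

// Map the object's key/value pairs into vertical Field/Value rows.
const rows = Object.entries(record).map(([k, v]) => ({ Field: k, Value: v }));

console.log(rows);
// [ { Field: 'statusCode', Value: 202 },
//   { Field: 'statusText', Value: 'Accepted' },
//   { Field: 'rowsCount', Value: 2 } ]
```

This avoids string surgery entirely: if the payload ever contains a literal bracket inside a value, the .replace() approach would corrupt it, while parsing handles it correctly.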

Without the replacement of the square brackets, the mapping no longer works.

I've attached a sample pipeline with the mapping and payload as a parameter.


Thanks in advance!


  • Apologies, the datatype is local datetime. That's why.

    I believe .toString() will do the job

4 Replies

    tstack
    Former Employee

    Correct, there is no way to pass a variable “up”. To elaborate on that a bit, the snaps all run in parallel, so there would be a race between the snap that is passing the variable “up” and the snap that is trying to read the variable. For example, if there were two files coming in, the Write snap might see the first file name in one execution and the second file name in another execution.
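    To illustrate the race with a hypothetical sketch in plain JavaScript (not SnapLogic code): two concurrent "snaps" share a variable, and what the reader sees depends entirely on timing.

    ```javascript
    // Hypothetical sketch: a shared variable written and read concurrently.
    let fileName = null;

    // Writer "passes the variable up" after some processing delay.
    const writer = (name, delayMs) =>
      new Promise(resolve => setTimeout(() => { fileName = name; resolve(); }, delayMs));

    // Reader samples the shared variable after its own delay; it may see
    // null, the first file name, or the second, depending on scheduling.
    const reader = (delayMs) =>
      new Promise(resolve => setTimeout(() => resolve(fileName), delayMs));

    Promise.all([writer("first.json", 10), writer("second.json", 30), reader(20)])
      .then(([, , seen]) => console.log(seen));
    // With these delays the reader happens to see "first.json"; change the
    // delays and it could just as easily see null or "second.json".
    ```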

    • graham-onpoint
      New Contributor II

      I can see how that would happen, but could that not be controlled with a ForEach Snap that serializes the processing?

      The use case you are thinking of would require a child pipeline due to the race condition; however, it is not a requirement for the pipeline I am developing, as (a) there will only be one file, and (b) it is acceptable to stop processing and throw an error if there is more than one file.

      • tstack
        Former Employee

        Hmm, I’m not quite sure what you mean here. The PipeExec snap basically obsoletes the ForEach snap.

        That’s not going to be true in the general case. So, we can’t add something so error prone and hope that people only use it in the right situations.