Add incoming document to new file row

We want the ability to write a document to a new row within a CSV file, but only if the document meets certain requirements.

Is this possible?
This article from a few years back suggests not, and I wondered whether there have been any advancements since.

Hi @NAl,

You can write a new row in the file (I suppose you want “append”), but that’s supported only for the FTP, FTPS and SFTP file protocols.

If you want to replicate this sort of logic, you can follow these steps:

  1. Create an empty CSV file (optional).
  2. Read this CSV file at the start of the pipeline, even if it’s empty.
  3. Read the new data from the other source.
  4. Use a “Gate” snap to combine both inputs.
  5. Concatenate the input that contains the new data with the old data, using a Mapper with the following expression: $old_data.concat($new_data)
  6. Split the newly concatenated data.
  7. Overwrite the initial file.

In this case, your new data will always be appended.

The pipeline should look something like this:


Thank you for providing a detailed outline. I’ve been away implementing this and looking at the right expressions to use to split/concatenate the data. I’ve managed to put the pipeline together; however, the Mapper snap isn’t returning any data based on the expression provided:

[Screenshot: Screen Shot 2022-01-19 at 15.13.34]

This is a copy of the JSON message before it goes into the Mapper. There is currently no data in the CSV file, hence the blank message on input0:

[
  {
    "input0": [],
    "input1": [
      {
        "A": "X",
        "B": "X",
        "C": "60",
        "D": null,
        "E": "X",
        "F": "",
        "G": "X",
        "H": "X",
        "I": "1320159",
        "J": 1430047,
        "K": "X",
        "L": "X",
        "M": "X",
        "N": "X",
        "O": false
      }
    ]
  }
]

If input0 contains the new data in your pipeline, you should write the following expression:

$input1.concat($input0)

This is working fine for me. Result:

[
   {
      "A":"X",
      "B":"X",
      "C":"60",
      "D":null,
      "E":"X",
      "F":"",
      "G":"X",
      "H":"X",
      "I":"1320159",
      "J":1430047,
      "K":"X",
      "L":"X",
      "M":"X",
      "N":"X",
      "O":false
   }
]
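As a quick sanity check of why the result contains only input1’s row: concatenating with an empty array returns the other side unchanged. A minimal illustration in plain Python, with list `+` standing in for the expression language’s `concat` (the variable names mirror the input views above):

```python
# input0 comes from the (currently empty) CSV file; input1 holds the new data.
input0 = []
input1 = [{"A": "X", "C": "60"}]

# Equivalent to the Mapper expression $input1.concat($input0)
result = input1 + input0
```

Since input0 is empty, `result` is exactly input1’s rows; once the file has data, those old rows would follow the new ones in the combined list.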

Okay so we’re getting there.
I’ve re-configured the Mapper and applied a few other changes but now the pipeline is returning an error:

Stacktrace: Metadata failed to load for ‘insert file location here’
Resolution: Check for URL syntax and file access permission

I read another article on here and wonder if it has something to do with setting up an Account in the File Reader Snap using Basic Auth?

The file is situated in a folder in SnapLogic under my username.

If the file is located within SLDB, then there’s no need for any type of account.
Can you please share the “File*” attribute from the File Reader snap?

[Screenshot: Screen Shot 2022-01-20 at 11.23.34]

This looks fine to me. Did you check whether the file is missing from the location, by any chance?

Looks like it’s accepted the full file location instead of the file name on its own.
Just testing this again.

Hello, so this part of the pipeline now runs without triggering the error pipeline, but it doesn’t seem to be pulling in the data from the file, and the concatenation expression in the Mapper snap still isn’t quite right. It’s difficult to post screenshots on here as there is sensitive data in the JSON messages.

Hello, here’s an update.
I found this article and used the expression there to update the one in my Mapper snap. The resulting data structure in the target path is the desired outcome!

[Screenshot: Screen Shot 2022-01-20 at 17.42.35]

This is great stuff so far. I’ll probably take a break now from testing and have some dinner before recommencing tomorrow. Thank you @j.angelevski for your patience.

Hello, couldn’t stay away. This is now working.