I'm completely stuck trying to write JSON to CSV - it's been 4 days. Can anyone help?

I’m brand new to SnapLogic and this is probably really easy - sorry in advance! But I’ve been stuck on this since Thursday, and it’s now Tuesday… please help me.

As a whole, I have this:

In more detail:

I have a JSON splitter, which has an output of the JSON I want to write to a CSV. And, if I preview the output in table view, it looks exactly like the CSV I want (I had an image here, but new users can only put one - sorry!). Previewing in JSON shows that it’s, well, JSON.
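(For anyone following along, here's a rough Python sketch of what I understand the JSON Splitter to be doing - the payload structure below is just a made-up example, not my real data:)

```python
# Hypothetical API payload; the real structure differs, this is just for illustration.
payload = {"results": [
    {"Id": 1, "FirstName": "Ada", "LastName": "Lovelace"},
    {"Id": 2, "FirstName": "Alan", "LastName": "Turing"},
]}

# Splitting on $results emits one document per array element,
# which is what the table-view preview then shows as rows:
documents = payload["results"]
for doc in documents:
    print(doc)
```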

After that, I have a CSV formatter, but I don’t have anything configured in it. Should I? The output view for this is more or less empty, so this clearly isn’t correct. I’ve tried JSON formatters and a million other things here. I just don’t know.

Then I have my file writer (and as a bonus, I want only some of the fields from the JSON to be written to the CSV file).

I can do it in Python:

import csv
import requests

# Call the API
r = requests.get(url_to_call, headers=headers)
json_to_write = r.json()

# Open a file for writing and create a CSV writer
outfile = open(file_name, 'w', newline='')
writer = csv.writer(outfile)

# Write CSV header row
writer.writerow(["Id", "FirstName", "LastName"])

# Loop through each record of the JSON; write only the key values needed
for x in json_to_write:
    writer.writerow([x["Id"], x["FirstName"], x["LastName"]])

outfile.close()

I found this post on turning JSON to CSV by @dmiller and this post on writing a Twitter query to a file, but as far as I can tell, those are just “use these snaps” answers without showing how the snaps are configured, and I’ve already tried those combinations of snaps.

I’m so lost. Any help appreciated.

Hmm, as a workaround, can you reply to your own message with the screenshot of the preview?

(We might want to look into removing this restriction)

Can you download the pipeline (see here) and attach it here so we can take a look at the configuration?

The default configuration of the CSV formatter should probably just work.

Thanks for your reply!

That’s a good idea. I hit the one image limit again, so I put it all together in one image.


(Note that I’m only pulling Id, FirstName, LastName, and Username from the JSON - it has a lot in it).

I’ll look at that doc on downloading the pipeline and get back to you in another reply. (I have to scrub it of all the private info first.)

Okay - theoretically it’s attached. ISS Example_2018_09_25(1).slp (6.5 KB)

(I changed the API to a test API, but it has the same structure as my data, so if we can get this working, my data should also work). Thank you!

Looking into the image restriction now @tstack

Bumped it up to 5 for now.

In your CSVFormatter, you’re using ‘$Username’ as one of the headers, and the resulting output is a column of empty strings. I assume that’s what you’re seeing too, yes? Note that it’s blank because there is no property named ‘Username’ in the input document. There is a property named ‘userId’, though.

So, if you change the header to ‘$userId’, you should see data coming out of the CSVFormatter. If you want to change the name of the property, you’ll need to use a Mapper snap, like so:
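In plain Python terms, the rename the Mapper performs looks roughly like this (the document contents here are assumed, based on the test API mentioned in the thread):

```python
# Hypothetical input document from the splitter; field names assumed.
doc = {"userId": 7, "id": 42, "title": "delectus aut autem"}

# A Mapper with expression $userId mapped to target path $Username
# behaves roughly like this rename:
mapped = dict(doc)
mapped["Username"] = mapped.pop("userId")

print(mapped["Username"])  # 7
```

After that, the CSVFormatter can use ‘$Username’ as a header and find data for it.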

I’m wondering if you were expecting the order of the list of headers in the CSVFormatter to correspond to the order of the columns in the preview? Is that right? Or, perhaps, I’m reading too much into this…


Oh, hmmm, yes, I understand. What was throwing me off: if you validate the pipeline, then at every snap (like after the JSON Splitter), clicking the green output doc on the snap connector to the right shows the correct output, but the CSV Formatter’s green doc looks totally wrong (even after I fixed the headers). So after it didn’t work the first time, I never checked the actual CSV file again:

I’ll keep a note to myself not to trust what the formatters display.

Thank you so much!! You’re a hero.

Click where it says “Preview…” You’re dealing with BINARY (diamond shape) data at this point, so the normal view doesn’t have the usual JSON data to show.
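To illustrate the document-vs-binary distinction (a rough sketch; field names assumed from the thread): upstream of the CSV Formatter the data is JSON-like documents, while downstream it is a single binary CSV stream, so the normal preview has nothing document-shaped to show.

```python
import csv
import io

# Upstream: JSON-like documents (what the document preview can display)
docs = [{"Id": 1, "FirstName": "Ada", "LastName": "Lovelace"}]

# Downstream of the formatter: one binary CSV stream (the diamond-shaped view)
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Id", "FirstName", "LastName"])
writer.writeheader()
writer.writerows(docs)

binary_output = buf.getvalue().encode("utf-8")
print(binary_output)
```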


THAT’S A LINK?!! It’s gray and only underlined on hover and looks just like all the other (non-link) parts of the page; I didn’t even think to try. Oh my goodness. Thank you!!


Thanks for the feedback, a bug has been logged.