Forum Discussion

akarsh
New Contributor III
2 years ago
Solved

Best way to read the data from a JSON list and process it item by item through a REST call

I have an input JSON file; please see the dummy example below:

{
  "totalCount": 2,
  "dataList": [
    {
      "IFA": "IFA1",
      "voltage": "200",
      "power": "100",
      "equipid": "1234ABC",
      "SN": "SalesForce-1"
    },
    {
      "IFA": "IFA2",
      "voltage": "200",
      "power": "100",
      "equipid": "1234ABC",
      "SN": "SalesForce-1"
    }
  ]
}

I have to read this file and process the items one by one.
First I have to get the IFA number and then do a Salesforce lookup to get the Id from Salesforce.

Then the Id returned from Salesforce should be mapped to the accountID field, and this, along with the other data in the file, should be passed to the Salesforce Upsert snap.

Can anyone help me with how to do this? I am new to SnapLogic.
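In plain terms, the flow being asked about is: split dataList into individual records, look up a Salesforce Id per IFA, add it as accountID, then feed each record to the upsert. Here is a minimal Python sketch of that per-item logic, assuming a stand-in lookup function (lookup_account_id and the Ids it returns are made up for illustration, not a real Salesforce call):

```python
import json

# Dummy input from the question above
RAW = """
{ "totalCount": 2, "dataList": [
  { "IFA": "IFA1", "voltage": "200", "power": "100", "equipid": "1234ABC", "SN": "SalesForce-1" },
  { "IFA": "IFA2", "voltage": "200", "power": "100", "equipid": "1234ABC", "SN": "SalesForce-1" } ] }
"""

def lookup_account_id(ifa):
    # Hypothetical stand-in for the Salesforce lookup step:
    # in a real pipeline this would query Salesforce by IFA.
    fake_directory = {"IFA1": "001A", "IFA2": "001B"}
    return fake_directory.get(ifa)

def build_upsert_records(raw_json):
    doc = json.loads(raw_json)
    records = []
    for item in doc["dataList"]:          # one record per list element
        record = dict(item)               # keep the original fields
        record["accountID"] = lookup_account_id(item["IFA"])  # map the Id
        records.append(record)            # each record then feeds the upsert
    return records
```

Each dict in the returned list carries the original fields plus the looked-up accountID, which is the shape the upsert step would receive.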

5 Replies

  • ptaylor
    Employee

    Ok, I’m glad that you were able to get it working well with a single topic.

    Performing joins on streaming data in real time is a very advanced subject. To discuss it in detail would require much more information about your use cases and is not a discussion I can really get into here in this forum. I would consider whether it might make sense to read the data into separate Snowflake tables and then use Snowflake to do the joins. If you need true streaming functionality like windowed joins then you might look at KsqlDB or Kafka Streams. It might be possible to do the joins in SnapLogic pipelines but that can get very tricky with real-time streams that don’t end, as our Join is designed for finite input streams. One thing to consider is a hybrid approach where you use KsqlDB to do the joins of the separate Kafka topics, which will produce a new topic containing the joined data. Then use our Kafka Consumer snap to read that topic and insert into Snowflake.