
How to Transform SQL Server Records to JSON usable format

bradwyatt
New Contributor II

I am querying transactions_table and batch_table using the SQL Server Execute snap. The batch_table returns 1 record, and in this example the transactions_table returns 2 records. I need to be able to pull the data into a JSON format that is compatible with my company's API.

Below is an example format. The "interface" key's values are manually entered by me. The "batch_table" key contains the values from the one record in the batch_table from SQL Server. The "transactions_table" key contains a list of all the records from the transactions_table.

What would be the most efficient way of implementing this?

{
    "interface": {
        "type": "INVOICE",
        "client": "SQLServer"
    },
    "data": {
        "batch_table": {
            "BATCH_ID": 1
        },
        "transactions_table": [{
            "TRANSACTION_ID": 1,
            "AMOUNT": 5.90
        },
        {
            "TRANSACTION_ID": 2,
            "AMOUNT": 11.20
        }]
    }
}
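For reference, the target shape can be sketched in plain Python outside SnapLogic, assuming the two query results are already available as dictionaries (the `build_payload` helper name is illustrative, not part of any API):

```python
import json

def build_payload(batch_row, transaction_rows):
    """Wrap one batch record and all transaction records in the API envelope."""
    return {
        "interface": {
            "type": "INVOICE",        # manually entered values
            "client": "SQLServer",
        },
        "data": {
            "batch_table": batch_row,               # the single batch record
            "transactions_table": transaction_rows, # every transaction record
        },
    }

batch = {"BATCH_ID": 1}
transactions = [
    {"TRANSACTION_ID": 1, "AMOUNT": 5.90},
    {"TRANSACTION_ID": 2, "AMOUNT": 11.20},
]
print(json.dumps(build_payload(batch, transactions), indent=2))
```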

Hi,

Here is one example of how you can achieve the result shown in your question above.
{
    "interface": {
        "type": "INVOICE",
        "client": "SQLServer"
    },
    "data": {
        "batch_table": {
            "BATCH_ID": 1
        },
        "transactions_table": [{
            "TRANSACTION_ID": 1,
            "AMOUNT": 5.90
        }, {
            "TRANSACTION_ID": 2,
            "AMOUNT": 11.20
        }]
    }
}

Transform_SQL_JSON_Results.slp (5.0 KB)


In the pipeline, I took the example data and converted it to CSV, so I put it inside a "CSV Generator" snap.

Later, in order to form the array for "transactions_table", the data is grouped by type, client, and batch_id with a "Group By Fields" snap, so the result looks like this:
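Roughly, the grouping step can be sketched in Python outside SnapLogic; the flat row layout and the target field name "group" here are illustrative assumptions, not the snap's exact output:

```python
from itertools import groupby
from operator import itemgetter

# Flat rows as they might come out of the CSV Generator: batch-level fields
# repeated alongside each transaction.
rows = [
    {"type": "INVOICE", "client": "SQLServer", "BATCH_ID": 1,
     "TRANSACTION_ID": 1, "AMOUNT": 5.90},
    {"type": "INVOICE", "client": "SQLServer", "BATCH_ID": 1,
     "TRANSACTION_ID": 2, "AMOUNT": 11.20},
]

# Group by the batch-level fields, collecting the transaction fields
# of each member row into a list.
key = itemgetter("type", "client", "BATCH_ID")
grouped = []
for (typ, client, batch_id), members in groupby(sorted(rows, key=key), key=key):
    grouped.append({
        "type": typ,
        "client": client,
        "BATCH_ID": batch_id,
        "group": [{"TRANSACTION_ID": m["TRANSACTION_ID"],
                   "AMOUNT": m["AMOUNT"]} for m in members],
    })
```

Each output document then carries the grouping fields once, plus the list that becomes the "transactions_table" array.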


At the end, a "JSON Generator" snap, using Apache Velocity, forms the output exactly as you expected it to be.


I hope this resolves your problem.

Regards,
Angela.

bradwyatt
New Contributor II

That worked, thank you so much!