03-24-2021 08:28 AM
Hi All,
I have a pipeline that takes data from a REST API and loads it into a SQL database. This is all working, and we are able to create the tables without having to define the schema in a mapper (we have hundreds of tables with thousands of fields, and it would be very difficult to map them all individually).
My current issue is that when converting from JSON to SQL Server, the string type is automatically mapped to varchar(8000). We have Japanese characters, among others, that are being lost because of this conversion. I need the columns to be nvarchar(4000), or nvarchar(max) if the string is longer than 4000 characters.
The question is: how do I make that conversion without having to map/define each and every field, while also making sure nothing gets truncated?
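To show the kind of after-the-fact fix I am hoping to avoid scripting by hand for hundreds of tables, here is a rough Python sketch that widens every varchar column to nvarchar after the load. The connection string, driver, and the choice to apply the ALTERs immediately are all assumptions, not part of my actual pipeline:

```python
# Sketch only: widen every varchar column in the database to nvarchar so
# Japanese (and other non-Latin) text is preserved. Server, database, and
# driver below are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"  # hypothetical
)

def widen_varchar_columns():
    conn = pyodbc.connect(CONN_STR)
    cur = conn.cursor()
    # Find every varchar column; CHARACTER_MAXIMUM_LENGTH is -1 for varchar(max).
    cur.execute(
        """
        SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME,
               CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE DATA_TYPE = 'varchar'
        """
    )
    rows = cur.fetchall()
    for schema, table, column, max_len, nullable in rows:
        # nvarchar(n) only goes up to 4000 characters, so anything wider
        # (including the auto-generated varchar(8000)) becomes nvarchar(max).
        if max_len == -1 or max_len > 4000:
            new_type = "nvarchar(max)"
        else:
            new_type = f"nvarchar({max_len})"
        null_clause = "NULL" if nullable == "YES" else "NOT NULL"
        alter = (
            f"ALTER TABLE [{schema}].[{table}] "
            f"ALTER COLUMN [{column}] {new_type} {null_clause};"
        )
        print(alter)       # review the generated statement...
        cur.execute(alter)  # ...or apply it directly
    conn.commit()
    conn.close()

if __name__ == "__main__":
    widen_varchar_columns()
```

Ideally, though, the columns would be created as nvarchar in the first place rather than rewritten afterwards.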
Thanks for your help.
06-29-2022 07:03 AM
Hi Garrett,
Were you able to resolve this?
I have a similar situation. I am trying to pull the pipeline execution history using the REST API and load it into a SQL table. Would you be able to share a sample pipeline that uses the REST API to pull data from the source and load it into a SQL table?
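In case it helps to show what I mean, the general shape of what I am trying to build is below. This is only a rough Python sketch of the pattern, not the actual pipeline; the endpoint URL, connection string, table name, and columns are all placeholders I made up:

```python
# Sketch only: pull JSON from a REST endpoint and insert it into a SQL Server
# table whose text columns are nvarchar. Endpoint, credentials, and table
# definition below are placeholders.
import requests
import pyodbc

API_URL = "https://example.com/api/pipeline-runs"  # hypothetical endpoint
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"  # hypothetical
)

def load_execution_history():
    # 1. Pull the execution history as JSON.
    resp = requests.get(API_URL, timeout=60)
    resp.raise_for_status()
    runs = resp.json()  # assumed: a list of {"pipeline": ..., "status": ..., "started": ...}

    # 2. Insert into a table whose text columns are nvarchar so nothing is lost.
    conn = pyodbc.connect(CONN_STR)
    cur = conn.cursor()
    cur.execute(
        """
        IF OBJECT_ID('dbo.PipelineRuns') IS NULL
            CREATE TABLE dbo.PipelineRuns (
                pipeline nvarchar(4000),
                status   nvarchar(100),
                started  nvarchar(100)
            );
        """
    )
    cur.fast_executemany = True
    cur.executemany(
        "INSERT INTO dbo.PipelineRuns (pipeline, status, started) VALUES (?, ?, ?)",
        [(r.get("pipeline"), r.get("status"), r.get("started")) for r in runs],
    )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    load_execution_history()
```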
I appreciate your help!