SQL Insert from JSON: change type and only type
Hi all, I have a pipeline that takes data from a REST API and loads it into a SQL database. This is all working, and we are able to create the table without having to define the schema in a Mapper (we have hundreds of tables with thousands of fields, so it would be very difficult to map them all individually).

My current issue is that when converting from JSON to SQL Server, strings are automatically converted to varchar(8000). We have Japanese characters, etc., that are being lost because of this conversion. I need the type to be nvarchar(4000), or nvarchar(max) if the string is longer than 4000 characters. The question is: how do I make that conversion without having to map/define each and every field, while also making sure I don't truncate anything? Thanks for your help.
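One way to approach this without mapping every field is to generate the target DDL from a sample record, choosing NVARCHAR(4000) for ordinary strings and NVARCHAR(MAX) for anything that may exceed 4000 characters. The sketch below is illustrative only (the table and field names are hypothetical, and it type-checks a single sample record rather than the whole feed):

```python
# Sketch: derive SQL Server column types from a sample JSON record,
# preferring NVARCHAR so multibyte (e.g. Japanese) text is preserved.
# Strings that may exceed 4000 characters fall back to NVARCHAR(MAX).
# Table and field names here are hypothetical.

def sql_type(value, max_len=4000):
    if isinstance(value, bool):   # check bool before int (bool subclasses int)
        return "BIT"
    if isinstance(value, int):
        return "BIGINT"
    if isinstance(value, float):
        return "FLOAT"
    s = str(value)
    return f"NVARCHAR({max_len})" if len(s) <= max_len else "NVARCHAR(MAX)"

def create_table_ddl(table, record):
    cols = ",\n  ".join(f"[{k}] {sql_type(v)}" for k, v in record.items())
    return f"CREATE TABLE [{table}] (\n  {cols}\n)"

sample = {"id": 1, "name": "こんにちは", "notes": "x" * 5000}
print(create_table_ddl("api_data", sample))
```

A single sample record may understate real field lengths, so in practice you would scan a batch of records (or just default every string column to NVARCHAR(MAX)) before emitting the DDL.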
Purple to Redshift-WiFi Analytics

Created by @asharifian, SnapLogic

This pattern integrates Purple with Redshift. Purple is a WiFi analytics application that builds detailed visitor profiles and tracks visitors at your venues, such as conferences, workshops, and gatherings. All of the data Purple collects is stored in a centralized, enterprise-class reporting suite, available for analysis. This pattern retrieves the analytics data from Purple and bulk upserts it into Redshift for reporting and additional data analytics.

Configuration

A child pipeline is used to generate the HMAC SHA256 key, which is required by the Purple APIs.

Sources: Purple Venues, Visitors
Targets: Redshift database tables to store venue and visitor data
Snaps used: Mapper, Pipeline Execute, REST Get, JSON Splitter, Copy, Redshift Bulk Upsert, Script, Join

Downloads

Purple to Redshift-WiFi Analytics.slp (21.4 KB)
HMAC_SHA256.slp (18.5 KB)
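The HMAC child pipeline's job can be sketched in a few lines of Python. This is only an illustration of HMAC-SHA256 signing; the exact string-to-sign format and how Purple expects the signature to be passed are assumptions, so consult Purple's API documentation for the real scheme:

```python
import hashlib
import hmac

# Illustrative HMAC-SHA256 signing, as performed by the HMAC_SHA256
# child pipeline. The message format below is a placeholder, not
# Purple's actual string-to-sign.
def hmac_sha256(secret: str, message: str) -> str:
    return hmac.new(secret.encode("utf-8"),
                    message.encode("utf-8"),
                    hashlib.sha256).hexdigest()

signature = hmac_sha256("my-private-key", "GET\n/api/company/v1/venues")
print(signature)  # 64-character hex digest
```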