🔍 Join our Navigation Study and Help Shape SnapLogic's User Experience!
Hello SnapLogic Community! We're conducting a study to improve navigation within SnapLogic, and we want your input! By sharing your valuable feedback, you'll play a pivotal role in improving the user experience of SnapLogic. This is an unmoderated study and should take less than 10 minutes of your time.

📝 STUDY IS CLOSED

Your insights matter to us, so don't miss this opportunity to make a difference!

Thank you,
The SnapLogic Team
Checking for optional properties and returning defaults in the expression language

With the Summer 2017 release there have been a couple of enhancements to the expression language that make it easier to check for the existence of properties in an object and return a default value. Previously, you could use the hasOwnProperty() method to check if a property is in an object. Now, objects have a get() method that returns the value of the property with the given name, or a default value. For example, to get the value of the property named 'optional', you can do:

$.get('optional') // Returns the value of $optional or null

To get the property, or a default value if the property is not found, you can do:

$.get('optional', 5)

Note that there is some subtlety here if the property can be in the object with a value of null. If you pass a default value to the get() method and the property is in the object with a value of null, you will get a null result. So, if you wish to get a default when the property is not in the object or when its value is null, you should use a logical OR with the default value. For example:

$.get('nullable') || 5

In addition to the get() method, we have added the 'in' operator as a shorthand for checking whether a property is in an object:

'optional' in $

Error messages for undefined references have also been improved to suggest other properties in the object with names similar to the one that was referenced. For example, in the following screenshot there is no 'ms' field, but the object does have a 'msg' field.

Here is an exported pipeline that demonstrates some of these features: optional-properties_2017_08_13.slp (4.8 KB)
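To make the null subtlety concrete, here is a small worked example (the field name and values are made up for illustration). Given an input document { "nullable": null }, the two expressions below behave differently in a Mapper:

$.get('nullable', 5)   // returns null, because the property exists and its value is null
$.get('nullable') || 5 // returns 5, because the logical OR replaces the null with the default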
Expression library for condensing Reltio REST API response

Before the 2017 Winter Release, transforming a Reltio REST API response JSON object into a simple structure could be a tedious task. The expression library feature from the 2017 Winter Release makes it much easier. Encouraged by @dmiller, I am sharing a pipeline to showcase this technique. In this fictional scenario, the ask is to output only the Mobile phone fields.

Attached:
Community posting expression library for condensing Reltio REST API response_2017_04_12.slp (7.6 KB)
reltio.expr.txt (136 Bytes)
Fake_Reltio API response.json.txt (3.1 KB)
Phone.json.txt (288 Bytes)

How to test the example pipeline:
1. Download the pipeline file and reltio.expr.txt
2. Rename reltio.expr.txt to reltio.expr
3. Upload the pipeline and reltio.expr to a project
4. Generate the previews

Content of the expression library file:

{
  ov: y => y.find(x => x.ov == true).value,
  findByType: (x, type) => x.find(y => this.ov(y.value.Type) == type && y.ov == true)
}

Screenshots in the original post show the pipeline property tab, the pipeline itself, and the first and second Mapper configurations.

Condensed data from the output:

[
  {
    "uri": "entities/geFfGTn",
    "updatedTime": 1492024546326,
    "attributes": {
      "FirstName": "Gr",
      "LastName": "Moun",
      "Gender": "Male",
      "Phone": {
        "CountryCode": "+1",
        "Number": "9999990003",
        "Type": "Mobile"
      }
    }
  }
]

The input Reltio API response JSON looked like this:

[
  {
    "uri": "entities/geFfGTn",
    "updatedTime": 1492024546326,
    "attributes": {
      "FirstName": [
        {
          "type": "configuration/entityTypes/Individuals/attributes/FirstName",
          "ov": true,
          "value": "Gr",
          "uri": "entities/geFfGTn/attributes/FirstName/sv4pKXS0"
        }
      ],
      "LastName": [
        {
          "type": "configuration/entityTypes/Individuals/attributes/LastName",
          "ov": true,
          "value": "Moun",
          "uri": "entities/geFfGTn/attributes/LastName/sv4pKfyW"
        }
      ],
      "Gender": [
        {
          "type": "configuration/entityTypes/Individuals/attributes/Gender",
          "ov": true,
          "value": "Male",
          "lookupCode": "M",
          "lookupRawValue": "M",
          "uri": "entities/geFfGTn/attributes/Gender/19BWhlRa3"
        }
      ],
      "Phone": [
        {
          "label": "Mobile 9999990003",
          "value": {
            "Type": [
              {
                "type": "configuration/entityTypes/Individuals/attributes/Phone/attributes/Type",
                "ov": true,
                "value": "Mobile",
                "uri": "entities/geFfGTn/attributes/Phone/19BWhm8Cd/Type/19BWhmKzP"
              }
            ],
            "Number": [
              {
                "type": "configuration/entityTypes/Individuals/attributes/Phone/attributes/Number",
                "ov": true,
                "value": "9999990003",
                "uri": "entities/geFfGTn/attributes/Phone/19BWhm8Cd/Number/19BWhmCSt"
              }
            ],
            "CountryCode": [
              {
                "type": "configuration/entityTypes/Individuals/attributes/Phone/attributes/CountryCode",
                "ov": true,
                "value": "+1",
                "uri": "entities/geFfGTn/attributes/Phone/19BWhm8Cd/CountryCode/19BWhmGj9"
              }
            ]
          },
          "ov": true,
          "uri": "entities/geFfGTn/attributes/Phone/19BWhm8Cd"
        },
        {
          "label": "Home 5193217654",
          "value": {
            "Type": [
              {
                "type": "configuration/entityTypes/Individuals/attributes/Phone/attributes/Type",
                "ov": true,
                "value": "Home",
                "uri": "entities/geFfGTn/attributes/Phone/zUqfhOhP/Type/zUqfhSxf"
              }
            ],
            "Number": [
              {
                "type": "configuration/entityTypes/Individuals/attributes/Phone/attributes/Number",
                "ov": true,
                "value": "5193217654",
                "uri": "entities/geFfGTn/attributes/Phone/zUqfhOhP/Number/zUqfhXDv"
              }
            ],
            "CountryCode": [
              {
                "type": "configuration/entityTypes/Individuals/attributes/Phone/attributes/CountryCode",
                "ov": true,
                "value": "+1",
                "uri": "entities/geFfGTn/attributes/Phone/zUqfhOhP/CountryCode/zUqfhbUB"
              }
            ]
          },
          "ov": true,
          "uri": "entities/geFfGTn/attributes/Phone/zUqfhOhP"
        }
      ]
    }
  }
]
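The Mapper expressions themselves are only visible in the screenshots, but based on the library content and the input/output shapes above, the condensing expressions would look roughly like the sketch below. This assumes the library file is registered under the name reltio in the pipeline's Expression Libraries property; the exact expressions in the original pipeline may differ.

lib.reltio.ov($attributes.FirstName)
// picks the surviving (ov == true) value out of a simple attribute array, e.g. "Gr"

lib.reltio.findByType($attributes.Phone, 'Mobile').value
// finds the Phone entry whose Type resolves to "Mobile"; its nested Type/Number/CountryCode
// arrays can then each be flattened with lib.reltio.ov()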
SnapLogic DateTime Conversion Guidelines

The SnapLogic Elastic Integration Platform native DATETIME data type is time zone-aware. For service endpoints that expose date, datetime, time, and timestamp data types that are not time zone-aware, the SnapLogic Expression Language can be used to easily convert to the supported 3rd-party data types.

The matrix below provides sample conversions between the SnapLogic DATETIME variable, $snapDateTime, and several known 3rd-party data types, normalized to the US/Pacific time zone.

NOTE: For 3rd-party DATETIME / TIMESTAMP data types that do not support a time zone, the samples assume that these values are stored normalized to the UTC (Coordinated Universal Time) time zone. In addition, links to reference documentation for 3rd-party data types have been provided where applicable.

| Service/Endpoint | Data Type | Sample Conversion Expression |
| --- | --- | --- |
| Amazon Redshift | DATE | LocalDate.parse($snapDateTime.toLocaleDateString('{"timeZone":"US/Pacific"}')) |
| Amazon Redshift | TIMESTAMP [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| MongoDB | DATETIME [3] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"US/Pacific"}')) |
| MySQL | DATE | LocalDate.parse($snapDateTime.toLocaleDateString('{"timeZone":"US/Pacific"}')) |
| MySQL | DATETIME [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| MySQL | TIME [1] | LocalTime.parse($snapDateTime.toLocaleTimeString('{"timeZone":"US/Pacific"}')) |
| MySQL | TIMESTAMP [2] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"US/Pacific"}')) |
| Oracle | DATE | LocalDate.parse($snapDateTime.toLocaleDateString('{"timeZone":"US/Pacific"}')) |
| Oracle | TIMESTAMP [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| Oracle | TIMESTAMP WITH TIME ZONE [3] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"US/Pacific"}')) |
| Oracle | TIMESTAMP WITH LOCAL TIME ZONE | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"US/Pacific"}')) |
| PostgreSQL | DATE | LocalDate.parse($snapDateTime.toLocaleDateString('{"timeZone":"US/Pacific"}')) |
| PostgreSQL | TIME [1] | LocalTime.parse($snapDateTime.toLocaleTimeString('{"timeZone":"UTC"}')) |
| PostgreSQL | TIME WITH TIME ZONE [3] | LocalTime.parse($snapDateTime.toLocaleTimeString('{"timeZone":"US/Pacific"}')) |
| PostgreSQL | TIMESTAMP [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| PostgreSQL | TIMESTAMP WITH TIME ZONE [3] | LocalTime.parse($snapDateTime.toLocaleTimeString('{"timeZone":"US/Pacific"}')) |
| SAP HANA | DATE | LocalDate.parse($snapDateTime.toLocaleDateString('{"timeZone":"US/Pacific"}')) |
| SAP HANA | TIME | LocalTime.parse($snapDateTime.toLocaleTimeString('{"timeZone":"US/Pacific"}')) |
| SAP HANA | SECONDDATE | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| SAP HANA | TIMESTAMP [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| Salesforce | DATE | LocalDate.parse($snapDateTime.toLocaleDateString('{"timeZone":"US/Pacific"}')) |
| Salesforce | DATETIME [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| Salesforce | TIME | LocalTime.parse($snapDateTime.toLocaleTimeString('{"timeZone":"US/Pacific"}')) |
| SQL Server | DATETIME [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |
| SQL Server | DATETIME2 [1] | LocalDateTime.parse($snapDateTime.toLocaleDateTimeString('{"timeZone":"UTC"}')) |

[1] Does not provide time zone support. If provided, the time zone offset will be stripped.
[2] Supports time zone. The time zone is not stored internally. By default, on a SELECT, the value is normalized to the time zone set for the user's local session.
[3] Supports time zone.

Also see Database Data Types in the documentation.
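As a quick illustration of the matrix, take a time zone-aware value of 2016-09-02T06:55:02.000 UTC and apply two of the expressions above. The sample value and results are the editor's illustration, built only from the conversion expressions already shown:

LocalDateTime.parse(Date.parse("2016-09-02T06:55:02.000Z").toLocaleDateTimeString('{"timeZone":"UTC"}'))
// value to store in a MySQL DATETIME column, normalized to UTC: 2016-09-02 06:55:02

LocalDate.parse(Date.parse("2016-09-02T06:55:02.000Z").toLocaleDateString('{"timeZone":"US/Pacific"}'))
// value to store in a MySQL DATE column: 2016-09-01, the previous calendar day in US/Pacific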
Date Epoch Example

An epoch date is either the number of seconds or milliseconds since 1 January 1970 00:00:00 UTC. UNIX and other applications use seconds, while JavaScript uses milliseconds. SnapLogic's expression language is based on JavaScript, so the epoch date is internally stored as milliseconds since the epoch. In the example below, we are passing in the date as seconds since the epoch.

1 - Set a project parameter. In my example, I have "dateEpoch" set to 1472799302.

2 - Use the following formula in a Mapper Snap:

Date.parse(Date.parse(1000*_dateEpoch).toString()).toLocaleDateTimeString('{"timeZone":"America/Chicago", "format":"MM-dd-yyyy HH:mm:ss"}')

This returns a value of: 09-02-2016 01:55:02

Further examples:

Date.parse(1000*_dateEpoch) = "2016-09-02T06:55:02.000 Etc/UTC"
Date.parse(1000*_dateEpoch).toString() = "2016-09-02T06:55:02.000Z"

date test epoch_2017_02_28.slp (4.6 KB)
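Going the other direction (from a SnapLogic DATETIME back to an epoch number for an endpoint that expects seconds) can be done by dividing the millisecond representation back down. This is a small editor's sketch, assuming the DateTime getTime() method (epoch milliseconds) is available in the expression language:

Date.now().getTime()                    // current time as milliseconds since the epoch
Math.floor(Date.now().getTime() / 1000) // current time as seconds since the epoch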
Creating elements out of normalized column data for Gravity Forms

I've got a JSON 'entry' (sample below) that's coming to me from a REST API. The elements in that entry consist of all the metadata about a form that users fill out, as well as all of their responses. Each user response is normalized by Gravity Forms, however: it creates a numbered element with the response as the value, and the "_labels" element contains the column or field names for each response. To get to my target database, though, I want to use the values inside the labels element as keys for the values that are at the top level.

I've reviewed a couple of posts here from Patrick that do something similar using a double mapping pipeline, and I was able to make my own pipeline using a JSON Splitter to get a JSON of just the labels, but I can't seem to figure out how to pivot or transpose the data to get a tuple to reinsert into the main document. My gut feel is that I need to get the data I want into its own set of documents, then join that to the original.

Essentially, I'd like to turn the entry for 1.3 into an element like this:

"First": "Post",

Sample entry:

"entries": [
  {
    "2": "Creating using REST API v2",
    "3": "Manually created using the POST route of the REST API v2",
    "id": "311",
    "form_id": "176",
    "post_id": null,
    "date_created": "2018-10-16 12:43:23",
    "date_updated": "2018-10-16 19:33:56",
    "is_starred": "0",
    "is_read": "1",
    "ip": "::1",
    "source_url": "http://localhost/wp.dev/?gf_page=preview&id=176",
    "user_agent": "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:38.0) Gecko/20100101 Firefox/38.0",
    "currency": "USD",
    "payment_status": null,
    "payment_date": null,
    "payment_amount": null,
    "payment_method": "",
    "transaction_id": null,
    "is_fulfilled": null,
    "created_by": "1",
    "transaction_type": null,
    "status": "active",
    "1.3": "Post",
    "1.6": "Entries 2",
    "1.2": "",
    "1.4": "",
    "1.8": "",
    "_labels": {
      "1": {
        "1.2": "Prefix",
        "1.3": "First",
        "1.4": "Middle",
        "1.6": "Last",
        "1.8": "Suffix"
      },
      "2": "Untitled",
      "3": "Untitled"
    }
  },
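One way to approach the pivot inside a single Mapper, rather than splitting and re-joining, is to drive the lookup from the _labels object itself. The sketch below is an editor's illustration, not the accepted answer from this thread; it assumes the expression language's object mapValues() and mapKeys() functions with (value, key) callback arguments, and that the entry shown above is the current input document ($):

$['_labels']['1']
  .mapValues((label, key) => $.get(key, ""))        // {"1.3": "First", ...} -> {"1.3": "Post", ...}
  .mapKeys((value, key) => $['_labels']['1'][key])  // {"1.3": "Post", ...}  -> {"First": "Post", ...}

The result could then be merged back over the original document with extend(), or shaped further in a second Mapper.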
AWS S3 Connection failure

Hello, I am trying to connect to an AWS S3 bucket with the Access Key ID and Secret Access Key using the AWS S3 Account type in SnapLogic, but it fails to validate and throws this error:

Failed to validate account: Cause: Access Denied (Service: Amazon S3; Status Code: 403)

Using the same credentials, I am able to connect to the AWS S3 bucket with the AWS CLI, and also in Postman by selecting the Authorization type AWS Signature. Can someone please help identify why it is failing to connect only in SnapLogic?
Expression Library: Database DateTime Conversion

The expressions for database datetime conversions listed in the topic SnapLogic DateTime Conversion Guidelines can be updated as follows to build an expression library:

{
  /*
   * Database DateTime Conversions
   * Sample conversions between the SnapLogic DATETIME variable and several known
   * 3rd party data types normalized to the US/Pacific time zone.
   */
  toRedshiftDate: x => LocalDate.parse(x.toLocaleDateString({"timeZone": "US/Pacific"})),
  toRedshiftTimestamp: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toMongoDBDateTime: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "US/Pacific"})),
  toMySQLDate: x => LocalDate.parse(x.toLocaleDateString({"timeZone": "US/Pacific"})),
  toMySQLDateTime: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toMySQLTime: x => LocalTime.parse(x.toLocaleTimeString({"timeZone": "US/Pacific"})),
  toMySQLTimeStamp: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "US/Pacific"})),
  toOracleDate: x => LocalDate.parse(x.toLocaleDateString({"timeZone": "US/Pacific"})),
  toOracleTimeStamp: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toOracleTimeStampTMZ: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "US/Pacific"})),
  toOracleTimeStampLocalTMZ: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "US/Pacific"})),
  toPostgresDate: x => LocalDate.parse(x.toLocaleDateString({"timeZone": "US/Pacific"})),
  toPostgresTime: x => LocalTime.parse(x.toLocaleTimeString({"timeZone": "UTC"})),
  toPostgresTimeTMZ: x => LocalTime.parse(x.toLocaleTimeString({"timeZone": "US/Pacific"})),
  toPostgresTimeStamp: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toPostgresTimeStampTMZ: x => LocalTime.parse(x.toLocaleTimeString({"timeZone": "US/Pacific"})),
  toSAPHANADate: x => LocalDate.parse(x.toLocaleDateString({"timeZone": "US/Pacific"})),
  toSAPHANATime: x => LocalTime.parse(x.toLocaleTimeString({"timeZone": "US/Pacific"})),
  toSAPHANASecondDate: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toSAPHANATimeStamp: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toSalesforceDate: x => LocalDate.parse(x.toLocaleDateString({"timeZone": "US/Pacific"})),
  toSalesforceDateTime: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toSalesforceTime: x => LocalTime.parse(x.toLocaleTimeString({"timeZone": "US/Pacific"})),
  toSQLDateTime: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"})),
  toSQLDateTime2: x => LocalDateTime.parse(x.toLocaleDateTimeString({"timeZone": "UTC"}))
}
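Once the file is saved (as dbdates.expr here, a name chosen only for this illustration) and registered in the pipeline's Expression Libraries property, a Mapper expression can call the conversions through the lib namespace:

lib.dbdates.toMySQLDateTime($snapDateTime)  // LocalDateTime normalized to UTC, for a MySQL DATETIME column
lib.dbdates.toSalesforceDate($snapDateTime) // LocalDate rendered in US/Pacific, for a Salesforce DATE field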
Regex Primer Pattern

Created by @msager

This pipeline pattern shows how regular expressions can be used in SnapLogic, with common examples across 10 use cases, e.g. basic matching, greedy vs. non-greedy, grouping, ranges, alternation, etc.

Configuration

Examples include:
- Basic Matching
- Global Match
- Alternation
- Global Match with case insensitivity
- Range sets with quantification
- Range sets with quantification and global
- Grouping
- Greedy
- Non-greedy
- Non-greedy with grouping

Sources: JSON Generator
Targets: Outputs results of Regex
Snaps used: JSON Generator, Copy, Mapper

Downloads
Regex Patterns.slp (18.0 KB)
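The Mapper configurations themselves are in the download, but to give a flavor of the use cases listed above, here are a few stand-alone expression-language examples written by the editor for illustration (they rely on the standard String match() method with JavaScript-style regex literals):

"snaplogic 2017".match(/[a-z]+/)      // basic matching: ["snaplogic"]
"snaplogic 2017".match(/[a-z0-9]+/g)  // global match: ["snaplogic", "2017"]
"<a><b>".match(/<.+>/)                // greedy: matches the whole string "<a><b>"
"<a><b>".match(/<.+?>/)               // non-greedy: stops at the first closing bracket, "<a>"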
Sum of values in array of objects without reduce function

Hello all, I stumbled upon a situation where I have to find the sum of a certain key across the objects in an array, without reducing the objects. What I mean to say is, here is the input:

[
  { key1: value1, key2: 10 },
  { key1: value2, key2: 20 },
  { key1: value3, key2: 30 }
]

The output that I am expecting should look like:

[
  { key1: value1, key2: 10, sum: 60 },
  { key1: value2, key2: 20, sum: 60 },
  { key1: value3, key2: 30, sum: 60 }
]

The total of the key2 values should be pushed into each object as a new key. Can someone please suggest how to deal with this? Reducing won't be the optimal solution, as it would discard the objects and just add a new object with the total sum.

Thanks.
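The accepted answer is not included in this excerpt, but one common way to keep every object while stamping the total onto each of them is to work on the array inside a single document. This is an editor's sketch, assuming the array arrives in a field called $rows (a made-up name); the reduce() here is the expression-language array method running inside one document, so no documents are discarded:

$rows.map(r => r.extend({ sum: $rows.reduce((total, x) => total + x.key2, 0) }))
// each object keeps its original keys and gains sum: 60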