SQL Server to Azure SQL Data Migration Pattern
Created by @asharifian, SnapLogic

This pattern migrates table data from SQL Server and loads it into Azure SQL.

Business case: SQL Server is the transactional database and experiences high data usage. Azure SQL is used for analytics and BI reporting. This use case is for a one-time data migration; if necessary, a delta pipeline can be created afterward by modifying these pipelines.

Configuration
Prerequisites: Tables can either be created on the fly, using the "Create table if not present" option in the Azure SQL - Bulk Load Snap in the child pipeline, or, if your tables have specific data type requirements, you can create the tables prior to executing the pipeline. You will need accounts for the Email Sender, Azure SQL, and SQL Server Snaps. The pipeline can accommodate the Azure SQL Dynamic account and SQL Server Dynamic account, in case you want the integration to be portable across differing databases.

Sources: SQL Server tables
Targets: Azure SQL tables
Snaps used: JSON Generator, Pipeline Execute, Tail, Email Sender, Azure SQL - Execute, SQL Server - Execute, Azure SQL - Bulk Load

Downloads
SQL Server-Azure SQL-Data Migration.slp (8.9 KB)
z_SQL Server-Azure SQL-Data Migration.slp (6.8 KB)

Microsoft Dynamics 365 - Finance & Operations to Salesforce - Upsert of Customer (Account) Data
Created by @asharifian, SnapLogic

This pattern illustrates the upsert of customer account details from Dynamics 365 - Finance & Operations to Salesforce.

Key points:
- APIs from the Dynamics 365 - Finance & Operations data management framework are used for this integration: specifically, triggering the export project in Dynamics 365 - Finance & Operations and retrieving the generated export data.
- The Dynamics 365 - Finance & Operations bulk export feature is used.
- The pipeline first triggers an export in Dynamics 365 - Finance & Operations, then generates the exported data (as binary, via the output view of the second bulk export Snap). The exported data is generated as a zip file, which is decompressed and its data streamed through to be upserted to Salesforce.
- The execution IDs of the past 200 exports are stored for logging. An execution ID can be used to re-generate the export from Dynamics 365 - Finance & Operations.

Configuration
A bulk export project in Dynamics 365 - Finance & Operations needs to be created for the entity used in the pipeline; in the case of this pattern, Customers. In addition, the bulk export outputs a different data type when one result is returned vs. more than one (an object for a single record and an array for more than one). That is why this pattern uses a Router Snap to differentiate and handle the two data types.

Sources: Dynamics 365 - Finance & Operations Customers
Targets: Salesforce Account
Snaps used: REST Post, Join, Copy, Mapper, File Reader, JSON Parser, Head, JSON Formatter, File Writer, ZipFile Read, Binary to Document, Document to Binary, XML Parser, Router, Union, Filter, Salesforce Upsert

Downloads
Dynamics FO-Salesforce-Customer Upsert.slp (47.2 KB)

Salesforce to Microsoft Dynamics 365 - Finance & Operations - Customer Upsert
Created by @asharifian, SnapLogic

This pattern illustrates the insert of customer accounts from Salesforce to Dynamics 365 - Finance & Operations.

Key points:
- The Dynamics 365 - Finance & Operations OData REST API is used for the insert operation.
- A Salesforce Subscriber Snap is used to capture account creations in Salesforce, via a platform event in Salesforce.
- A customer account identifier value is generated within SnapLogic to send to Dynamics 365 - Finance & Operations and, upon the insert in Dynamics 365 - Finance & Operations, sent back to Salesforce. This marries the record between Dynamics 365 - Finance & Operations and Salesforce.
To prevent duplicates, you can modify the pipeline to either:
a. Filter out Salesforce records that have the ERP_Customer_Id__c custom field populated; if that field is populated, the record was already sent to Dynamics 365 - Finance & Operations in the past.
b. Use a Router Snap to insert the record into Dynamics 365 - Finance & Operations unless the ERP_Customer_Id__c custom field is already populated in Salesforce; in that case, use the Dynamics 365 - Finance & Operations Update custom Snap to update it instead.

Configuration
OData APIs need to be enabled and set up in Dynamics 365 - Finance & Operations in order to use them. OData APIs are used for CRUD operations.

Sources: Salesforce Account
Targets: Dynamics 365 - Finance & Operations Customers
Snaps used: Salesforce Subscriber, Salesforce Read, Router, Mapper, Union, Salesforce Update

Downloads
Salesforce-Dynamics FO Customer Profiles.slp (21.2 KB)

ServiceNow-Microsoft Dynamics 365-Leads Integration
Created by @asharifian, SnapLogic

This pattern retrieves leads from ServiceNow and sends them to the Microsoft Dynamics 365 for Sales CRM application. Upon success or error, the integration log is sent back to a different table in ServiceNow for monitoring and tracking purposes. Within Dynamics 365, leads can be managed and tracked throughout their lifecycle. This also allows automated leads to enter your sales funnel, from which you can use social media, campaigns, marketing, etc. to further your sales.

Configuration
This is a bi-directional integration in that leads data is first retrieved from ServiceNow, created in Dynamics 365, and the result of that creation attempt is sent back to ServiceNow for logging.
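The write-back step can be sketched as below: building a log record from the Dynamics 365 creation result to post to the ServiceNow tracking table. The field names are hypothetical placeholders, not taken from the pattern:

```python
def build_log_record(lead_id: str, result: dict) -> dict:
    """Build a log entry for the ServiceNow tracking table from a
    Dynamics 365 create attempt. The u_* field names are hypothetical
    placeholders; real column names depend on the ServiceNow table."""
    return {
        "u_lead_id": lead_id,
        "u_status": "success" if result.get("error") is None else "error",
        "u_message": result.get("error") or "Lead created in Dynamics 365",
    }
```

In the pattern itself, the equivalent shaping is done in a Mapper Snap before the record is posted back to ServiceNow.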
Sources: ServiceNow Leads
Targets: Microsoft Dynamics 365 Lead
Snaps used: REST Get, Mapper, REST Post, Copy, Join, Union, JSON Formatter, Binary to Document, Dynamics 365 For Sales Create, JSON Splitter

Downloads
ServiceNow-MS Dynamics 365-Leads Integration.slp (21.6 KB)

Oracle-SAP-Salsify-GetProductByUPC
Created by @asharifian, SnapLogic

This pattern serves as a triggered-task REST endpoint that allows users to pass in a UPC code and retrieve product details from Salsify. A lookup is done in Oracle for product details, and in SAP for additional product details as well as the materials that compose the product. That information is then passed to Salsify, an eCommerce platform, which returns sales data for the product to be sent back to the end user. The use case is an application integration with Salesforce that can display this information via Salesforce Lightning or Apex.

Configuration
The material ID from SAP is sent directly to the Salsify REST GET API to retrieve the eCommerce product sales data.

Sources: Oracle and SAP product and material details tables
Targets: Salsify Products
Snaps used: Mapper, Filter, Oracle - Lookup, SAP Execute, REST Get

Downloads
Oracle-SAP-Salsify-GetProductByUPC.slp (10.2 KB)

Purple to Redshift-WiFi Analytics
Created by @asharifian, SnapLogic

This pattern integrates Purple with Redshift. Purple is a WiFi analytics application that builds detailed visitor profiles and tracks visitors in your venues, such as conferences, workshops, and gatherings. At Purple, all of the collected data is stored within a centralized, enterprise-class reporting suite, available for analysis. This pattern retrieves the analytics data from Purple and bulk upserts it into Redshift for reporting and additional data analytics.

Configuration
A child pipeline is used to generate the HMAC SHA256 key, which is a requirement for the Purple APIs.
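The signing step that the child pipeline performs can be sketched as follows. The exact composition of the string-to-sign (and the header Purple expects it in) is an assumption to verify against Purple's API documentation:

```python
import hashlib
import hmac

def purple_signature(private_key: str, string_to_sign: str) -> str:
    """Compute a hex-encoded HMAC-SHA256 signature over the request's
    string-to-sign. What goes into string_to_sign (HTTP verb, date,
    path, etc.) is an assumption; consult Purple's API docs."""
    digest = hmac.new(private_key.encode("utf-8"),
                      string_to_sign.encode("utf-8"),
                      hashlib.sha256)
    return digest.hexdigest()
```

The downloadable HMAC_SHA256.slp child pipeline performs this computation (via a Script Snap) so the parent pipeline can attach the signature to each Purple API call.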
Sources: Purple Venues, Visitors
Targets: Redshift database tables that store venue and visitor data
Snaps used: Mapper, Pipeline Execute, REST Get, JSON Splitter, Copy, Redshift Bulk Upsert, Script, Join

Downloads
Purple to Redshift-WiFi Analytics.slp (21.4 KB)
HMAC_SHA256.slp (18.5 KB)

Salesforce Contacts - Database Replication (SQL Server-MongoDB-Oracle)
Created by @asharifian

This pattern provides an example pipeline for replicating data from Salesforce to multiple database types: SQL Server, MongoDB, and Oracle. Data replication is useful for disaster recovery and analytics, and improves overall system resilience and reliability.

Configuration
Two approaches are covered in this pattern:
- Data replication upon success of an insert into a dependent database: for example, data is first inserted into MongoDB and, upon success, inserted into Oracle.
- Data replication in parallel: for example, data is inserted into MongoDB and SQL Server at the same time.

Sources: Salesforce Contacts
Targets: Database tables for customer contact information in SQL Server, MongoDB, and Oracle
Snaps used: Salesforce Read, Mapper, Copy, MongoDB - Insert, SQL Server - Insert, Oracle - Insert

Downloads
Salesforce Contacts-Database Replication (SQL Server-MDB-Oracle).slp (14.5 KB)

Workday to Oracle and BambooHR of Employee Information
Created by @asharifian

This pattern provides an HRIS integration from Workday to Oracle and BambooHR. Oracle stores employee information in a database table for analytics purposes. BambooHR is an HRIS system that stores employee information for human resources and onboarding. The source is the Human_Resources service, Get_Workers object in Workday.

Configuration
You can get a free trial of BambooHR and test out their APIs. The API is relatively easy to configure.
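As a rough illustration of what the REST Post Snap sends to BambooHR, the request can be assembled as below. The endpoint path and auth scheme follow BambooHR's public API (the API key is used as the basic-auth username), but treat these details as assumptions to verify against BambooHR's documentation:

```python
import base64

def bamboohr_add_employee_request(subdomain: str, api_key: str, employee: dict):
    """Assemble the URL, headers, and body for a BambooHR 'add employee'
    call. No network call is made here; a REST Post Snap (or an HTTP
    client) would send the request."""
    url = f"https://api.bamboohr.com/api/gateway.php/{subdomain}/v1/employees/"
    # BambooHR uses HTTP basic auth: API key as username, any value as password.
    token = base64.b64encode(f"{api_key}:x".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    }
    return url, headers, employee
```

In the pattern, the Mapper upstream of the REST Post Snap shapes the Get_Workers fields into this employee body.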
Sources: Workday Get_Workers
Targets: Database table for Oracle; Employees for BambooHR
Snaps used: Workday Read, Mapper, Copy, Oracle - Insert, JSON Splitter, REST Post

Downloads
Workday to Oracle-BambooHR.slp (13.2 KB)

Salesforce to John Galt Atlas - Opinion Lines (Opportunity Line Item Schedules) Integration
Created by @asharifian, SnapLogic

This pattern illustrates the integration between Salesforce and John Galt Atlas. Atlas is a supply chain demand planning application. This integration updates opportunity line item schedule (OLIS) quantity information from Salesforce to Atlas; this data is known as opinion lines in Atlas.

Key points:
- A Salesforce platform event is used such that, upon the update of the quantity on an OLIS, the SnapLogic pipeline consumes the platform event and sends that information to Atlas. The pipeline can run in an always-on state to constantly listen for the platform event.
- Atlas is communicated with in two ways: 1) via SQL Server, to check whether an opinion line record already exists (if it does, the record is updated; otherwise it is inserted); and 2) via the Atlas APIs, to insert or update the record. The insert/update operations are done via a child pipeline.
- Four Salesforce objects are used to retrieve the data. The most important data point, Quantity, comes from the OLIS object. The other three objects are Account, Opportunity, and Opportunity Line Item.

Configuration
Basic auth credentials are passed in as the payload to the REST Post Snap to retrieve the bearer token from John Galt Atlas. This can be done via the HTTP Entity, using the account.username and account.password expressions. In addition, ensure the correct "profile" value is used in the endpoints when calling Atlas.
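The token exchange can be sketched as below. The /token path and payload field names are assumptions (in the pattern, the credentials come from the Snap account via account.username and account.password):

```python
def build_atlas_token_request(base_url: str, username: str, password: str):
    """Assemble the REST Post call that exchanges basic-auth credentials
    for a bearer token. The '/token' path and payload keys are
    hypothetical; check the Atlas API's actual auth endpoint."""
    url = f"{base_url}/token"
    payload = {"username": username, "password": password}
    return url, payload

def bearer_headers(token: str) -> dict:
    """Headers attached to subsequent Atlas calls once the token is held."""
    return {"Authorization": f"Bearer {token}"}
```

The child pipeline then passes the retrieved token to the REST Post/Patch Snaps that insert or update the opinion lines.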
Sources: Salesforce Opportunity, Opportunity Line Item, Opportunity Line Item Schedule
Targets: John Galt Atlas Forecast Opportunity
Snaps used: Salesforce Subscriber, Salesforce Read, Mapper, Copy, Router, Join, Filter, SQL Server - Select, REST Post, REST Patch, Union, Pipeline Execute

Downloads
Salesforce-Atlas_Forecast Product.slp (35.4 KB)
z_Salesforce-Atlas_Forecast Product.slp (12.8 KB)

John Galt Atlas to Salesforce of Forecast Data
Contributed by @asharifian, SnapLogic

This pattern provides an integration between John Galt Atlas and Salesforce. Atlas is a supply chain demand planning application. This integration inserts and updates annual forecast data from Atlas to Salesforce.

Key points:
- The retrieval of data from Atlas uses the "export custom view in JSON" API.
- Once the forecast data is pulled from Atlas, the integration does the following in Salesforce: upserts Opportunities, creates Opportunity Line Items (if they don't already exist in Salesforce), and inserts/updates the data on Opportunity Line Item Schedules. In general, the Opportunity object must be created first, then the Opportunity Line Item, then the Opportunity Line Item Schedule.
- This integration will generally be run sparingly, perhaps once a month, to ensure the fiscal year's forecast data is up to date.

Configuration
A view would need to be created in John Galt Atlas beforehand, containing the data points you would like to retrieve from Atlas. From there, the "export custom view in JSON" API can be called to get the data.

The opportunity line item schedule (OLIS) object cannot be upserted in Salesforce, as that is a limitation on the Salesforce side. That is why this pattern has separate Salesforce Create and Salesforce Update Snaps for the OLIS object.
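Because OLIS records can't be upserted, the Router's create-vs-update decision can be sketched like this, keying on a hypothetical external ID field (the real pattern matches on whatever key the SOQL lookup returns):

```python
def olis_branch(record: dict, existing_ids: set) -> str:
    """Decide which branch an OLIS record takes: 'update' if a schedule
    with the same external ID already exists in Salesforce, otherwise
    'create'. The Atlas_External_Id__c field name is a hypothetical
    placeholder for the matching key."""
    if record.get("Atlas_External_Id__c") in existing_ids:
        return "update"
    return "create"
```

This mirrors the Router Snap feeding the separate Salesforce Create and Salesforce Update Snaps in the pipeline.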
Source: John Galt Atlas Forecast Opportunity
Target: Salesforce Opportunity, Opportunity Line Item, Opportunity Line Item Schedule
Snaps used: REST Post, Mapper, CSV Parser, Filter, Salesforce SOQL, Pipeline Execute, Router, Salesforce Update, Salesforce Create, Union, Tail, Salesforce Upsert

Downloads
Atlas-SF Annual Forecast Opportunities.slp (39.1 KB)
z_Atlas-SF Annual Forecast Opportunities-Opportunity.slp (5.7 KB)
z_Atlas-SF Annual Forecast Opportunities-OpportunityLineItem.slp (9.9 KB)