Azure SQL to Salesforce Upsert
Created by Chris Ward, SnapLogic

This pattern connects to an Azure SQL database, runs a query, and then upserts the results to an object in Salesforce. Both error and successful records are written to a log file within SLDB. The pattern could be used, for example, to migrate customer data stored in a transactional Azure SQL table into Salesforce for sales prospecting. Both Azure SQL and Salesforce accounts are required, and the necessary field mappings are specified in the Mapper Snap.

Configuration
Sources: Azure SQL table
Targets: Any Salesforce object
Snaps used: Azure SQL Execute, Mapper, Salesforce Upsert, JSON Formatter, File Writer

Downloads
Azure SQL to Salesforce Upsert.slp (12.9 KB)

Microsoft Dynamics 365 - Finance & Operations to Salesforce - Upsert of Customer (Account) data
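The success/error logging split used in the Azure SQL to Salesforce pattern above can be sketched in plain Python. This is a minimal illustration, not the Snap's actual behavior: the record fields ("success", "error") are assumptions, since the real split happens between the Salesforce Upsert Snap's output and error views.

```python
import json

def partition_results(results):
    # Split upsert results into success and error records. In the pipeline
    # this split is done by the Snap's output vs. error views; the field
    # names used here are hypothetical.
    successes = [r for r in results if r.get("success")]
    errors = [r for r in results if not r.get("success")]
    return successes, errors

results = [
    {"Id": "0015e00000A1B2C", "success": True},
    {"Id": None, "success": False, "error": "REQUIRED_FIELD_MISSING"},
]
successes, errors = partition_results(results)

# Mirror the JSON Formatter + File Writer steps: one JSON log per outcome.
success_log = json.dumps(successes)
error_log = json.dumps(errors)
```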
Created by @asharifian, SnapLogic

This pattern illustrates the upsert of customer account details from Dynamics 365 - Finance & Operations to Salesforce.

Key points:
- APIs from the Dynamics 365 - Finance & Operations data management framework are used for this integration: one call triggers the export project, and another retrieves the generated export data.
- The Dynamics 365 - Finance & Operations bulk export feature is used. The pipeline first triggers an export, then retrieves the exported data (as binary, via the output view of the second bulk export Snap).
- The exported data is generated as a zip file. The zip file is decompressed and its contents are streamed through to be upserted to Salesforce.
- The execution IDs of the past 200 exports are stored for logging. An execution ID can be used to re-generate the export from Dynamics 365 - Finance & Operations.

Configuration
A bulk export project must be created in Dynamics 365 - Finance & Operations for the entity used in the pipeline (in this pattern, Customers). In addition, the bulk export outputs a different data type when one result is returned vs. more than one: an object for a single record, an array for multiple records. That is why this pattern uses a Router Snap to differentiate and handle the two shapes.

Sources: Dynamics 365 - Finance & Operations Customers
Targets: Salesforce Account
Snaps used: REST Post, Join, Copy, Mapper, File Reader, JSON Parser, Head, JSON Formatter, File Writer, ZipFile Read, Binary to Document, Document to Binary, XML Parser, Router, Union, Filter, Salesforce Upsert

Downloads
Dynamics FO-Salesforce-Customer Upsert.slp (47.2 KB)

Microsoft Dynamics 365 - Finance & Operations to Salesforce - Customer Upsert
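The zip decompression and the object-vs-array quirk handled by the Router Snap in the Dynamics 365 export pattern above can be sketched in Python. The file name and record shape are hypothetical; the point is the normalization of the two export shapes into one stream.

```python
import io
import json
import zipfile

def normalize(records):
    # The bulk export emits an object for a single record and an array for
    # several; coerce both shapes to a list (the Router Snap's job here).
    return records if isinstance(records, list) else [records]

# Simulate the zip file that the export generates (contents hypothetical).
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("Customers.json", json.dumps({"CustomerAccount": "C-001"}))

# Decompress and stream the payload through, as the ZipFile Read Snap does.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    payload = json.loads(zf.read("Customers.json"))

rows = normalize(payload)  # a single object becomes a one-element list
```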
Created by @asharifian, SnapLogic

This pattern illustrates the insert of customer accounts from Salesforce into Dynamics 365 - Finance & Operations.

Key points:
- The Dynamics 365 - Finance & Operations OData REST API is used for the insert operation.
- A Salesforce Subscriber Snap captures account creations in Salesforce via a Salesforce platform event.
- A customer account identifier is generated within SnapLogic, sent to Dynamics 365 - Finance & Operations with the insert, and then, once the insert succeeds, written back to Salesforce. This marries the record between Dynamics 365 - Finance & Operations and Salesforce.
- To prevent duplicates, you can modify the pipeline to either:
  a. Filter out Salesforce records whose ERP_Customer_Id__c custom field is already populated; if that field is populated, the record was already sent to Dynamics 365 - Finance & Operations in the past.
  b. Use a Router Snap so that records whose ERP_Customer_Id__c custom field is already populated are routed to an update in Dynamics 365 - Finance & Operations instead of an insert. In that case, you can use the Dynamics 365 - Finance & Operations Update custom Snap.

Configuration
OData APIs must be enabled and set up in Dynamics 365 - Finance & Operations in order to use them. The OData APIs are used for CRUD operations.

Sources: Salesforce Account
Targets: Dynamics 365 - Finance & Operations Customers
Snaps used: Salesforce Subscriber, Salesforce Read, Router, Mapper, Union, Salesforce Update

Downloads
Salesforce-Dynamics FO Customer Profiles.slp (21.2 KB)

Salesforce Contacts - Database Replication (SQL Server-MongoDB-Oracle)
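The duplicate-prevention routing and the identifier "marry" step described in the Dynamics 365 insert pattern above can be sketched in Python. The uuid4 scheme is an assumption (the pattern does not say how the identifier is generated), and the routing mirrors option (b)'s Router Snap.

```python
import uuid

def route(record):
    # A populated ERP_Customer_Id__c means the record was already sent to
    # Dynamics 365 F&O, so route it to an update; otherwise create it.
    return "update" if record.get("ERP_Customer_Id__c") else "create"

def with_generated_id(record):
    # Generate the customer account identifier in the integration layer;
    # it is sent to D365 F&O with the insert and written back to Salesforce.
    # (uuid4 is a hypothetical choice of scheme.)
    return {**record, "ERP_Customer_Id__c": str(uuid.uuid4())}
```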
Created by @asharifian

This pattern provides an example pipeline to replicate data from Salesforce to multiple database types: SQL Server, MongoDB, and Oracle. Data replication is useful for disaster recovery and analytics, and it improves overall system resilience and reliability.

Configuration
Two approaches are covered in this pattern:
- Data replication on success of an insert into a dependent database: for example, data is first inserted into MongoDB and, upon success, inserted into Oracle.
- Data replication in parallel: for example, data is inserted into MongoDB and SQL Server at the same time.

Sources: Salesforce Contacts
Targets: Database tables for customer contact information in SQL Server, MongoDB, and Oracle
Snaps used: Salesforce Read, Mapper, Copy, MongoDB - Insert, SQL Server - Insert, Oracle - Insert

Downloads
Salesforce Contacts-Database Replication (SQL Server-MDB-Oracle).slp (14.5 KB)

Salesforce to John Galt Atlas - Opinion Lines (Opportunity Line Item Schedules) Integration
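The two replication approaches in the pattern above can be sketched in Python, with plain lists standing in for the three databases (everything here is a stand-in; the pipeline uses the Copy Snap and the database Insert Snaps).

```python
from concurrent.futures import ThreadPoolExecutor

# In-memory stand-ins for the three target databases (hypothetical).
mongo, oracle, sql_server = [], [], []

def insert(db, record):
    db.append(record)
    return True  # a real insert would report success or failure

record = {"FirstName": "Ada", "LastName": "Lovelace"}

# Approach 1: dependent replication - Oracle only after MongoDB succeeds.
if insert(mongo, record):
    insert(oracle, record)

# Approach 2: parallel replication - MongoDB and SQL Server concurrently,
# as the Copy Snap fans the stream out to both Insert Snaps.
with ThreadPoolExecutor() as pool:
    list(pool.map(lambda db: insert(db, record), [mongo, sql_server]))
```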
Created by @asharifian, SnapLogic

This pattern illustrates the integration between Salesforce and John Galt Atlas. Atlas is a supply chain demand planning application. This integration updates opportunity line item schedule (OLIS) quantity information from Salesforce to Atlas; in Atlas, this data is known as opinion lines.

Key points:
- A Salesforce platform event is used: when the quantity on an OLIS is updated, the SnapLogic pipeline consumes the platform event and sends that information to Atlas. The pipeline can run in an always-on state to continuously listen for the platform event.
- The pipeline communicates with Atlas in two ways: 1) via SQL Server, to check whether an opinion line record already exists (if it does, the record is updated; otherwise it is inserted); 2) via the Atlas APIs, to insert or update the record. The insert/update operations are done in a child pipeline.
- Four Salesforce objects are used to retrieve the data. The most important field, Quantity, comes from the OLIS object; the other three objects are Account, Opportunity, and Opportunity Line Item.

Configuration
Basic auth credentials are passed as the payload to the REST Post Snap to retrieve the bearer token from John Galt Atlas. This can be done via the HTTP Entity, using the account.username and account.password expressions. In addition, ensure the correct "profile" value is used in the endpoints when calling Atlas.

Sources: Salesforce Opportunity, Opportunity Line Item, Opportunity Line Item Schedule
Targets: John Galt Atlas Forecast Opportunity
Snaps used: Salesforce Subscriber, Salesforce Read, Mapper, Copy, Router, Join, Filter, SQL Server - Select, REST Post, REST Patch, Union, Pipeline Execute

Downloads
Salesforce-Atlas_Forecast Product.slp (35.4 KB)
z_Salesforce-Atlas_Forecast Product.slp (12.8 KB)

John Galt Atlas to Salesforce of Forecast Data
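The exists-then-update-else-insert decision that the Atlas pattern above makes via SQL Server - Select can be sketched in Python. The key shape (opportunity ID plus period) is an assumption; the pipeline's actual match criteria live in the SQL Server - Select Snap.

```python
# Hypothetical set of opinion-line keys already present in Atlas,
# standing in for the SQL Server - Select existence check.
existing_keys = {("OPP-001", "2024-01")}

def decide(opportunity_id, period):
    # Update when the opinion line exists, insert otherwise - the same
    # branch the Router Snap takes before REST Patch vs. REST Post.
    return "update" if (opportunity_id, period) in existing_keys else "insert"
```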
Contributed by @asharifian, SnapLogic

This pattern provides an integration between John Galt Atlas and Salesforce. Atlas is a supply chain demand planning application. This integration inserts and updates annual forecast data from Atlas to Salesforce.

Key points:
- Data is retrieved from Atlas using the "export custom view in JSON" API.
- Once the forecast data is pulled from Atlas, the integration does the following in Salesforce: upserts Opportunities, creates Opportunity Line Items (if they don't already exist in Salesforce), and inserts/updates the data on Opportunity Line Item Schedules. In general, the Opportunity object must be created first, then the Opportunity Line Item, then the Opportunity Line Item Schedule.
- This integration will generally be run sparingly, perhaps once a month, to ensure the fiscal year's forecast data is up to date.

Configuration
A view must be created in John Galt Atlas beforehand, containing the data points you would like to retrieve from Atlas. From there, the "export custom view in JSON" API can be called to get the data. The opportunity line item schedule (OLIS) object cannot be upserted in Salesforce (a limitation on the Salesforce side), which is why this pattern uses separate Salesforce Create and Salesforce Update Snaps for the OLIS object.

Source: John Galt Atlas Forecast Opportunity
Target: Salesforce Opportunity, Opportunity Line Item, Opportunity Line Item Schedule
Snaps used: REST Post, Mapper, CSV Parser, Filter, Salesforce SOQL, Pipeline Execute, Router, Salesforce Update, Salesforce Create, Union, Tail, Salesforce Upsert

Downloads
Atlas-SF Annual Forecast Opportunities.slp (39.1 KB)
z_Atlas-SF Annual Forecast Opportunities-Opportunity.slp (5.7 KB)
z_Atlas-SF Annual Forecast Opportunities-OpportunityLineItem.slp (9.9 KB)

Read data from Salesforce and write to SQS Producer
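Two mechanics from the Atlas-to-Salesforce pattern above can be sketched in Python: the parent-before-child load order, and the create/update split that emulates an upsert for OLIS (since Salesforce does not allow upserting that object). The record shape, with "sobject" and "Id" keys, is an assumption.

```python
# Parents must exist before children in Salesforce.
ORDER = {
    "Opportunity": 0,
    "OpportunityLineItem": 1,
    "OpportunityLineItemSchedule": 2,
}

def sort_for_load(records):
    # Load Opportunities first, then line items, then schedules.
    return sorted(records, key=lambda r: ORDER[r["sobject"]])

def olis_operation(record):
    # OLIS cannot be upserted, so emulate one: records matched by the
    # Salesforce SOQL lookup (they carry an Id) go to Salesforce Update,
    # the rest go to Salesforce Create.
    return "update" if record.get("Id") else "create"
```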
Created by @sreddi

This pipeline pattern reads data from Salesforce with the Salesforce Read Snap and writes it to Amazon SQS using the Amazon SQS Producer Snap.

Configuration
Specify the Region and Queue name in the SQS Producer Snap.

Sources: Salesforce Case
Targets: SQS
Snaps used: Salesforce Read, JSON Formatter, SQS Producer

Downloads
Reading Data from Salesforce and writing in SQS Producer snap.slp (5.8 KB)

Salesforce Attachment to Box
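The JSON Formatter step in the SQS pattern above can be sketched in Python, along with batching: if you later send with the SQS batch API, SendMessageBatch accepts at most 10 entries per call. The batching is an addition for illustration; the pattern itself sends via the SQS Producer Snap.

```python
import json

def to_sqs_batches(records, batch_size=10):
    # Serialize each record to JSON (the JSON Formatter's role) and group
    # messages into batches of at most 10, SQS's SendMessageBatch limit.
    messages = [json.dumps(r) for r in records]
    return [messages[i:i + batch_size]
            for i in range(0, len(messages), batch_size)]

batches = to_sqs_batches([{"n": i} for i in range(25)])
```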
Created by @skatpally

This pattern downloads a (contract) attachment from Salesforce and writes it to Box. The maximum attachment size is 25 MB.

Configuration
Update the File path in the Box Write Snap to the correct directory within your Box instance.

Sources: Salesforce attachment
Targets: File on Box
Snaps used: Salesforce Read, Mapper, Salesforce Attachment Download, Document to Binary, Box Write

Downloads
Salesforce_attachments_to_Box.slp (8.1 KB)

Create Box folders for Salesforce cases
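A guard for the 25 MB limit mentioned in the attachment pattern above can be sketched in Python. BodyLength is the Salesforce Attachment field holding the size in bytes; whether the pattern actually filters on it is an assumption, so treat this as an optional pre-check before the Box write.

```python
MAX_BYTES = 25 * 1024 * 1024  # the pattern's 25 MB attachment limit

def within_limit(attachment):
    # Skip attachments over the limit before attempting the Box write.
    # BodyLength is Salesforce's attachment size field, in bytes.
    return attachment.get("BodyLength", 0) <= MAX_BYTES
```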
Created by @skatpally

For an account created in Salesforce, SnapLogic checks whether a Box folder named with the Salesforce case number exists. If it doesn't exist, SnapLogic creates a new Box folder and names it with the Salesforce case number. SnapLogic also creates a shared link to the newly created Box folder.

Configuration
Set the SFDCAccount pipeline parameter to the Salesforce Account ID.

Sources: Salesforce account
Targets: Box folder
Snaps used: Mapper, Salesforce Read, Sort, Unique, Box Add Folder, REST Put

Downloads
Create_Box_folders_for_Salesforce_cases.slp (11.9 KB)

Sync leads from Salesforce to Marketo
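The check-before-create logic in the Box folder pattern above can be sketched in Python: deduplicate the case numbers (the Sort and Unique Snaps), then create only the folders that don't already exist. The folder names are hypothetical.

```python
# Hypothetical names of Box folders that already exist.
existing = {"00001001", "00001002"}

def folders_to_create(case_numbers):
    # Sort + Unique, then keep only case numbers with no matching folder,
    # as the pipeline does before Box Add Folder.
    return sorted(set(case_numbers) - existing)
```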
Created by @mthammareddy

For a new lead created in Salesforce, SnapLogic searches for a matching lead in Marketo by email address. If a matching lead is not found in Marketo, SnapLogic creates a new lead.

Configuration
Specify your access token and endpoint URL in the pipeline parameters. The initial Mapper contains an expression to look up a particular user. You can either replace the ID value every time, or define a pipeline parameter to pass the value in and change the expression to:

"Id='" + _SFDC_ID + "'"

where SFDC_ID is the name of the pipeline parameter.

Sources: Salesforce Lead
Targets: Marketo Lead
Snaps used: REST Post, Mapper, Salesforce Read, JSON Generator, Join, Pipeline Execute

Downloads
Sync_Leads_from_Salesforce_to_Marketo.slp (11.5 KB)

Note: This pattern also uses the pattern to get the Marketo access token found here: Get Marketo Access Token
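The match-by-email decision in the Salesforce-to-Marketo pattern above can be sketched in Python. The lead shapes and the case-insensitive email comparison are assumptions; in the pipeline the matching is done by the Join Snap against Marketo's lead search results.

```python
# Hypothetical existing Marketo leads, keyed by lowercased email.
marketo_leads = {"ada@example.com": {"id": 42}}

def sync_lead(sf_lead):
    # Match by email; create a Marketo lead only when no match exists.
    email = sf_lead["Email"].lower()
    if email in marketo_leads:
        return "skip"
    marketo_leads[email] = {"id": len(marketo_leads) + 1}
    return "create"
```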