Move Salesforce Opportunity to NetSuite Sales Order

Created by @dshen

This pipeline pattern takes a closed Salesforce Opportunity, creates a Sales Order in NetSuite, and then updates Salesforce with the NetSuite ID.

Configuration
- The pipeline takes a closed Salesforce Opportunity ID as the pipeline parameter.
- The pipeline assumes the Salesforce products and NetSuite items have been synchronized and that the NetSuite item internal ID is stored on each Salesforce product.
- The pipeline creates a new NetSuite customer if the Salesforce Opportunity Account does not already exist.
- The pipeline creates the NetSuite Sales Order from the approved Opportunity Quote; each Salesforce Quote Line Item creates a NetSuite Sales Order line item.

Sources: Salesforce Opportunity, Account, Quote, Quote Line Items
Targets: NetSuite Customer, Sales Order
Snaps used: Salesforce SOQL, NetSuite Create, Salesforce Update, Mapper, Router, Join, Group By Fields, JSON Splitter

Downloads
SFDC Opportunity to NS SalesOrder.slp (36.8 KB)

Export Accounts from Adaptive Insights

Contributed by @dshen

This pipeline demonstrates how to connect to Adaptive Insights and export accounts using its REST API.

Configuration
To connect to Adaptive Insights, provide the login username and password along with the calling method (e.g., exportAccounts) in the first Snap so that the proper XML payload can be constructed for the REST Post.

Source: Adaptive Insights account
Target: no target defined
Snaps used: Mapper, Copy, XML Formatter, File Writer, Binary to Document, REST Post

Downloads
Adaptive Insights- Export Account Sample.slp (9.9 KB)

Read candidates from Taleo using the REST API

Contributed by @dshen

This pipeline demonstrates how to connect to Taleo using the REST API and fetch candidate data.

Configuration
To connect to Taleo, provide the following information in the REST Post Snap (titled Get Taleo authToken):
- Service URL of the Taleo instance
- Taleo orgCode of your organization
- Taleo username and password

Source: Taleo Candidate
Target: no target defined
Snaps used: REST Post, REST Get, JSON Splitter

Downloads
Taleo Get Candidates.slp (5.8 KB)

Dropbox Samples via REST API

Contributed by @dshen

This pattern contains three separate segments that demonstrate how to use the REST API to:
- list folder contents
- download a file from Dropbox
- upload a file to Dropbox

Source: Dropbox Files
Target: Dropbox Files
Snaps used: JSON Generator, REST Post, JSON Splitter, Mapper, Document to Binary, CSV Parser

Configuration
Supply a Dropbox bearer token in the pipeline parameters. To list the contents of a folder, supply the Dropbox file path and the other parameters in the JSON Generator so they can be passed to Dropbox as the request payload. For uploading a file to Dropbox, the pipeline assumes the file has already been uploaded to SnapLogic. A request-level sketch of the same three calls is shown after the download link below.

Reference
Dropbox API v2 for HTTP Developers

Downloads
Dropbox Samples.slp (14.4 KB)
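For orientation only, here is a minimal Python sketch of the three Dropbox API v2 calls that the pipeline segments make; it is not part of the pipeline download. The token, folder, and file paths are placeholders.

```python
import json
import requests

TOKEN = "YOUR_DROPBOX_BEARER_TOKEN"            # placeholder; supply your own token
AUTH = {"Authorization": f"Bearer {TOKEN}"}

# 1. List the contents of a folder (the "list folder contents" segment).
listing = requests.post(
    "https://api.dropboxapi.com/2/files/list_folder",
    headers=AUTH,
    json={"path": "/reports", "recursive": False},   # hypothetical folder
)
for entry in listing.json().get("entries", []):
    print(entry[".tag"], entry["path_display"])

# 2. Download a file; the path travels in the Dropbox-API-Arg header, not the body.
download = requests.post(
    "https://content.dropboxapi.com/2/files/download",
    headers={**AUTH, "Dropbox-API-Arg": json.dumps({"path": "/reports/sales.csv"})},
)
with open("sales.csv", "wb") as out:
    out.write(download.content)

# 3. Upload a file as a raw octet stream.
with open("sales.csv", "rb") as src:
    requests.post(
        "https://content.dropboxapi.com/2/files/upload",
        headers={
            **AUTH,
            "Dropbox-API-Arg": json.dumps({"path": "/reports/sales_copy.csv", "mode": "add"}),
            "Content-Type": "application/octet-stream",
        },
        data=src,
    )
```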
How to test pipelines on UAT environment

SnapLogic releases its software four times a year to all customers to update the Integration Cloud (Control Plane) and the Snaplexes (Data Planes). A separate environment (the UAT pod) is provided to customers so they can test their existing pipelines against the latest code base and report issues before GA. The service is provided free of charge, and it is up to each customer to decide whether to take advantage of the offer.

UAT schedule: Three weeks before the official release, the latest code base is made available to customers on the UAT environment. The UAT code base is then refreshed over the next two weekends to include the latest fixes and updates. Take the Fall 2016 release (4.7), which went GA on 11/12/2016, as an example:
- 10/22/2016: The first 4.7 code base was made available to customers on the UAT environment, along with a draft release note on the documentation site detailing the changes in the release.
- 10/29/2016: The UAT environment was refreshed with the second update of the 4.7 code base, along with an updated draft release note.
- 11/5/2016: The UAT environment was refreshed with the third update of the 4.7 code base. The release is in code freeze during the last week before GA, which means only P0 defects are addressed.
- 11/12/2016: The official upgrade kicked off at 9:00 PM PST. A 4-hour downtime was planned to complete the upgrade of the Integration Cloud (Control Plane) and the Snaplexes (Data Plane) for every customer (cloud or ground).

How to test on UAT:
1. The customer requests access to UAT through their account team.
2. The SnapLogic operations team provisions an org for the customer on the UAT environment (https://uat.elastic.snaplogic.com).
3. The customer sets up a Groundplex that connects to the UAT environment.
4. The customer exports the pipelines that need to be verified from the production environment and imports them into the UAT environment.
5. The customer sets up the necessary account credentials to connect to the data sources and targets.
6. The customer verifies that the pipelines work as expected and reports any issues to SnapLogic.
7. The customer repeats the testing after each weekly update.

Migrate projects, pipelines, accounts, tasks across environments

One of the common requests from customers is how to easily move the SnapLogic assets (pipelines, accounts, and tasks) in a project from one environment to another as part of a configuration management or promotion process. The standard project export/import function moves the pipelines, tasks, files, etc., but the customer still needs to re-create the account objects and re-link all the account references in the target environment. This extra step can be a hassle for a large project with many pipelines and Snaps that require account references.

The sample pipelines here use the SnapLogic Metadata Snap Pack to read the asset definitions from a project and write them to the target location (in the same org or a different one). They move the accounts, pipelines, and tasks from the source to the target and, more importantly, maintain the account references in the pipelines and the pipeline references in the tasks. The user only needs to re-enter the account passwords and re-select the task Snaplex in the target environment.

The attached project export contains 4 pipelines:
- 01 Main - Migrate Project: the main pipeline, which calls the following pipelines in sequence to move the assets.
- 02 Upsert Account: moves the Account objects in a project to the target location.
- 03 Upsert Pipeline: moves the Pipelines in a project to the target location.
- 04 Upsert Task: moves the Task objects in a project to the target location.

The source and target locations (org + project space + project) are specified in the pipeline parameters. To move a project across orgs, the user account needs read/write permission on both the source and the target location.

The project export of those pipelines can be downloaded here: Migrate Project v2.zip (10.7 KB)

SnapLogic Triggered Task using an On-Premises URL through Load Balanced Groundplex Nodes

This article describes how a triggered task is invoked in the SnapLogic Elastic Integration Platform using an On-Premises URL through load-balanced Groundplex nodes.

Assume that your organization has a SnapLogic Groundplex provisioned with 3 nodes. When an On-Premises URL is exposed for a triggered task, it automatically suggests the hostname of one of the nodes that belong to the Groundplex, e.g.:

https://GP-Node1:<port>/api/1/rest/feed/<RELATIVE_PATH_TO_TASK>/

To provide redundancy across all nodes in the Groundplex when a triggered task is invoked, a load balancer can be placed in front of the Groundplex nodes. Once the load balancer is set up and configured, SnapLogic uses it in the auto-generated On-Premises URL, e.g.:

https://GP-LB:<PORT>/api/1/rest/feed/<RELATIVE_PATH_TO_TASK>/

The following diagram describes the flow of network requests made when a triggered task is executed remotely using the load-balancer On-Premises URL.
1. A remote client invokes the triggered task using the On-Premises URL that points to the load balancer (e.g., GP-LB).
2. The load balancer forwards the request to an active Groundplex node; GP-Node1 is selected for the purpose of this example.
3. The Groundplex node that receives the triggered-task request asks the Control Plane which node the task should be executed on.
4. The Control Plane selects an active Groundplex node; GP-Node2 is selected for the purpose of this example, and the triggered task is prepared for execution on GP-Node2.
5. An HTTPS connection is created between GP-Node1 and GP-Node2 so that data can be streamed between the nodes.
6. The data is read from and written to the endpoints.
7. The response message is sent back to the caller through GP-Node1 and then GP-LB (the load balancer).
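From the client's point of view (step 1 above), the call is an ordinary HTTPS request to the load-balancer URL. The sketch below is a minimal Python illustration of that call; the host, port, task path, payload, and bearer token are placeholders, and whether the Authorization header and the POST body are honored depends on how your task and Groundplex are configured.

```python
import requests

# Hypothetical load-balancer host, port, and task path; substitute your own values.
TASK_URL = "https://GP-LB:8084/api/1/rest/feed/MyOrg/MyProject/MyTriggeredTask"

response = requests.post(
    TASK_URL,
    # Bearer token shown in the task details; whether it is enforced depends on
    # how the task and the Groundplex feed are configured.
    headers={"Authorization": "Bearer <TASK_BEARER_TOKEN>"},
    # If the pipeline exposes an open input view, POSTed documents can be
    # consumed as its input; otherwise this body can be omitted.
    json=[{"opportunity_id": "0065000000XyzAB"}],
    timeout=300,
    verify=False,  # only if the Groundplex nodes use self-signed certificates
)
print(response.status_code)
print(response.text)
```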
How to send a file to Salesforce as an attachment

This example demonstrates how to attach a file to a Salesforce record as an Attachment. The sample pipeline reads a file from the source, queries an Account record from Salesforce, and attaches the file to that record as an Attachment.
1. Salesforce Read queries the Account record that the file will be attached to.
2. Extract the Salesforce record Id and OwnerId from the Salesforce output.
3. Use the Binary to Document Snap to convert the binary file stream into a document for mapping.
4. Merge the IDs with the file using the Merge option in the Join Snap.
5. Map the Salesforce Attachment structure.
6. Create the Salesforce Attachment record using the Salesforce Create Snap.

Configuration
Snaps in use: Salesforce Read, File Reader, Binary to Document, Mapper, Join, Salesforce Create

Downloads
Salesforce Attachment_2017_02_23.slp (10.3 KB)

Connecting SnapLogic to an MS-Access MDB database

You can use the JDBC Snap to access an MS-Access MDB database through a third-party JDBC driver; the example here uses the JDBC driver from UCanAccess. Note that the JDBC driver cannot read from an SLDB:// location, so the database file must be local to the Groundplex or otherwise accessible from the Groundplex.
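If you want to sanity-check the driver class and connection URL outside SnapLogic before configuring the JDBC account, a quick sketch such as the one below can help. It assumes Python with the jaydebeapi package and a local UCanAccess distribution; the jar paths, versions, database path, and table name are placeholders.

```python
import jaydebeapi

# Placeholder paths: UCanAccess also needs its bundled dependency jars
# (jackcess, hsqldb, commons-lang, commons-logging) on the classpath.
UCANACCESS_JARS = [
    "/opt/ucanaccess/ucanaccess-5.0.1.jar",
    "/opt/ucanaccess/lib/jackcess-4.0.1.jar",
    "/opt/ucanaccess/lib/hsqldb-2.5.0.jar",
    "/opt/ucanaccess/lib/commons-lang3-3.8.1.jar",
    "/opt/ucanaccess/lib/commons-logging-1.2.jar",
]

# The driver class and the jdbc:ucanaccess://<path to .mdb> URL format are the
# same values the JDBC account would use inside SnapLogic.
conn = jaydebeapi.connect(
    "net.ucanaccess.jdbc.UcanaccessDriver",
    "jdbc:ucanaccess:///data/Northwind.mdb",   # hypothetical database file
    jars=UCANACCESS_JARS,
)
try:
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM Customers")  # hypothetical table
    for row in cursor.fetchmany(5):
        print(row)
finally:
    conn.close()
```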