Error Handling in SnapLogic - InterWorks
Hello! We have a blog post on the topic of error handling and best practices on our website; the URL is below. It covers most of what you need to know about error handling and the best practices to implement in your pipelines. Please feel free to leave feedback in the comments section, and if you have any further questions you can contact me.

https://interworks.com.mk/error-handling-in-snaplogic/

CI/CD Solution with Bitbucket
Submitted by @Linus and @uchohan

The pipelines in this solution implement a proposed CI/CD process. The implementation and documentation enable the following capabilities:

- Source control for any SnapLogic Pipeline, Task, and Account
- Commit an entire project
- Commit an individual asset
- Specify a commit message
- Specify a branch name
- Automatic Bitbucket project, repository, and branch creation
- Automatic Bitbucket CI/CD file upload and Pipeline enablement
- Automatic SnapLogic project space, project, and asset creation
- Pull assets from Bitbucket into a SnapLogic project
- Revert changes based on a specific branch
- Revert an entire project or a specific asset
- SnapLogic Compare Pipeline review
- Bitbucket Pull Request creation, approval, and merge
- Automatic promotion of assets from development to production

Terminology

- A SnapLogic project space (belongs to a SnapLogic organization) is mapped to a Bitbucket project.
- A SnapLogic project (belongs to a SnapLogic project space) is mapped to a Bitbucket repository (belongs to a Bitbucket project).
- Each repository has one or more Bitbucket branches. By default, the master branch reflects the state of assets in the SnapLogic production organization. Additional branches (feature branches) inherit from the master branch and reflect various new development efforts in the SnapLogic development organization.

Developer assets

Each SnapLogic user who will be committing or pulling assets to the Bitbucket space can have their own individual assets. It is recommended that each user duplicates the User_Bitbucket project and replaces User with their unique name.
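In practice, the commit and pull pipelines described above are usually exposed as SnapLogic Triggered Tasks, so kicking off a commit from a script or CI job reduces to one HTTPS request carrying pipeline parameters. The sketch below shows that call shape in Python; the task URL, bearer token, and parameter names are hypothetical placeholders, not taken from the attached pipelines — consult the attached PDF for the real ones.

```python
# Hypothetical sketch: invoking a "Commit Project" pipeline exposed as a
# SnapLogic Triggered Task. Task URL, token, and parameter names are
# placeholders, not the actual values from this solution.
import urllib.parse
import urllib.request

def build_commit_request(task_url, token, project_path, branch, message):
    # Pipeline parameters are passed as query-string parameters on the task URL.
    query = urllib.parse.urlencode({
        "project_path": project_path,   # e.g. /MyOrg/MySpace/MyProject
        "branch": branch,               # feature branch to commit to
        "commit_message": message,
    })
    req = urllib.request.Request(task_url + "?" + query, method="POST")
    req.add_header("Authorization", "Bearer " + token)
    return req

if __name__ == "__main__":
    req = build_commit_request(
        "https://elastic.snaplogic.com/api/1/rest/slsched/feed/MyOrg/shared/CommitProjectTask",
        "my-task-token", "/MyOrg/Dev/MyProject", "feature/new-mapping", "initial commit")
    with urllib.request.urlopen(req) as resp:   # fires the pipeline
        print(resp.status)
```

A scheduled CI job could call this after every merged pull request to keep the SnapLogic production organization in sync with the master branch.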
Although covered in greater detail in the attached PDF, the User_Bitbucket project holds these four Pipelines, each containing a single Snap:

- Commit Project - Commits any Pipelines, Accounts, and Tasks within the specified SnapLogic project to the specified branch in Bitbucket
- Commit Asset - Commits the specified asset within the specified SnapLogic project to the specified branch in Bitbucket
- Pull Project - Reads any Pipelines, Accounts, and Tasks from the specified branch in the specified Bitbucket repository into the specified project and organization in SnapLogic
- Pull Asset - Reads the specified asset from the specified branch in the specified Bitbucket repository into the specified project and organization in SnapLogic

For each Pipeline, each user needs to update the bitbucket_account Pipeline Parameter in the respective Snaps to match the path to their own Bitbucket Account.

Downloads

Documentation: CI_CD Documentation.pdf (1.3 MB)

For the User_Bitbucket project:
- Commit Asset.slp (3.5 KB)
- Commit Project.slp (3.4 KB)
- Pull Asset.slp (3.6 KB)
- Pull Project.slp (3.6 KB)

Note: These pipelines all rely on shared pipelines located in a CICD-BitBucket project. Make sure to update the mappings to the pipelines within the CICD-BitBucket project to your location.

For the CICD-BitBucket project:
- 1.0 Main - SL Project to Bitbucket.slp (17.5 KB)
- 1.1 Create Project and Repo.slp (19.2 KB)
- 1.2 SL Asset to Bitbucket.slp (14.8 KB)
- 2.0 Main - Migrate Assets To SL.slp (23.1 KB)
- 2.1 Upsert Space And Project.slp (16.4 KB)
- 2.2 Read Assets.slp (29.3 KB)
- 2.2.1 Upsert Pipeline To SL.slp (12.8 KB)
- 2.2.2 Upsert Account To SL.slp (17.9 KB)
- 2.2.3 Upsert Task To SL.slp (21.2 KB)
- PromotionRequest.slp (26.0 KB)

We Need You: Influence Our Next Big Thing
Hey there, I'm Jackie Curry, a new member of the user experience team at SnapLogic. We're working on some exciting improvements to SnapLogic, and I need your help to influence our upcoming products through user research. We'd like to show you some early concepts of what we're working on to get your feedback and input. We want to know if these changes would help you do your job more efficiently. Plus, as a thank-you, we're offering a $100 gift card to all participants who complete the study. If you're interested, please email me at designresearch@snaplogic.com and I'll set up a time to chat. Don't worry, I promise to keep it brief!

Thank you,
Jackie

Jackie Curry
Principal User Experience Designer
jcurry@snaplogic.com

CSV to Workday Tenant
Submitted by @stodoroska from InterWorks

This pipeline reads a CSV file, parses the content, and then uses the Workday Write Snap to call the web service operation Put_Applicant to write the data into a Workday tenant.

Configuration

If there is no lookup match in the SQL Server lookup table, MKD is used as the default country code.

Sources: CSV file on the file share system
Targets: Workday tenant
Snaps used: File Reader, CSV Parser, SQL Server - Lookup, Mapper, Union, Workday Write

Downloads

CSV2Workday.slp (17.3 KB)

Connecting to Marketo with the REST Snap Pack
SnapLogic's REST OAuth account supports OAuth 2.0, but it does not work with Marketo's OAuth implementation. To work with Marketo, you must authenticate manually using the REST Get Snap. In this pipeline, we pass the credentials in as pipeline parameters. Note: this method exposes your credentials in the pipeline.

Authorization

To simplify the process, define the following pipeline parameters:

- url: the REST API URL for your Marketo instance, for example https://xxx-xxx-xxx.mktorest.com
- clientID: the client ID for API access
- clientKey: the client secret for API access

Add a REST Get Snap (labeled Marketo Login here) and configure it as follows. For Service URL, toggle on the Expression button ( = ) and set the field to:

_url + '/identity/oauth/token?grant_type=client_credentials&client_id=' + _clientID + '&client_secret=' + _clientKey

Remove the input view. Validate the Snap and it will return a response that contains an access_token and scope. In this example, we follow the REST Get with a Mapper Snap to map the token out of the array.

Using the Access Token

In subsequent Snaps, we pass this token as a header rather than as a query parameter, because that simplifies paged operations such as Get Lead Changes. Here's an example of a simple call that does this. For Service URL, toggle on the Expression button ( = ) and set the field to:

_url + '/rest/v1/activities/types.json'

Under HTTP Header, set Key to Authorization and, with the Expression button ( = ) toggled on, set Value to:

'Bearer ' + $accessToken

Paged Operations

For more complex operations, such as getting lead changes, you need to make two API calls: the first creates a paging token, and the second uses that paging token, typically with the paging mechanism enabled in our REST Get Snap.

Get Paging Token

This REST Get Snap (renamed Get Paging Token for clarity) is where you specify the query parameters.
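Outside SnapLogic, the same call sequence — fetch a token, fetch a paging token, then page through lead changes with a throttle delay — can be sketched in Python. The URL shapes below mirror the Snap expressions in this post; the sinceDateTime parameter name and response fields follow Marketo's REST docs as described here, so treat the exact endpoint details as assumptions to verify against your instance.

```python
# Sketch of the Marketo call sequence described above, outside SnapLogic.
# Base URL and credentials are placeholders; URL shapes mirror the Snap
# expressions in this post.
import json
import time
import urllib.parse
import urllib.request

def token_url(base, client_id, client_key):
    # Mirrors the Marketo Login Snap's Service URL expression.
    return (base + "/identity/oauth/token?grant_type=client_credentials"
            + "&client_id=" + client_id + "&client_secret=" + client_key)

def next_page_url(base, page_token, fields):
    # Mirrors the REST Get Snap's "Next URL" paging expression.
    return "%s/rest/v1/activities/leadchanges.json?nextPageToken=%s&fields=%s" % (
        base, page_token, fields)

def get_json(url, access_token=None):
    req = urllib.request.Request(url)
    if access_token:
        # Pass the token as a header, as recommended for paged operations.
        req.add_header("Authorization", "Bearer " + access_token)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def lead_changes(base, access_token, since, fields):
    # First call creates the paging token; subsequent calls consume it.
    paging = get_json(base + "/rest/v1/activities/pagingtoken.json?sinceDateTime="
                      + urllib.parse.quote(since), access_token)
    page_token = paging["nextPageToken"]
    while True:
        body = get_json(next_page_url(base, page_token, fields), access_token)
        yield from body.get("result") or []   # "result" is absent when empty
        if not body.get("moreResult"):
            break
        page_token = body["nextPageToken"]
        time.sleep(0.5)  # stay under Marketo's 100-calls-per-20-seconds limit

if __name__ == "__main__":
    base = "https://xxx-xxx-xxx.mktorest.com"   # placeholder from this post
    token = get_json(token_url(base, "myClientID", "myClientKey"))["access_token"]
    for lead in lead_changes(base, token, "2021-01-01T00:00:00Z", "firstName,lastName"):
        print(lead)
```

Note how the null-safety concern from the pipeline carries over: the sketch guards against a missing result key the same way enabling "Null safe" on the Splitter Snap does.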
For instance, if you want to get lead changes since a particular date, you pass that in via sinceDateTime. The example provided uses a literal string, but this could be a pipeline parameter or, ideally, a Date object formatted to match what Marketo expects. For Service URL, use:

_url + '/rest/v1/activities/pagingtoken.json'

Configure the Paging Mechanism

When calling Get Lead Changes (via a REST Get Snap), keep a few things in mind:

- You need to pass nextPageToken as a query parameter, along with the fields you want back. Ideally, the list of fields should be in a pipeline parameter, because it appears twice in this configuration.
- The leads are returned in $entity.result, which is an array. This field will not exist if there are no results, so you need to enable Null safe on the Splitter Snap that follows this REST Get.

The paging expressions for the REST Get Snap are:

Has next: $entity.moreResult == true
Next URL: '%s/rest/v1/activities/leadchanges.json?nextPageToken=%s&fields=firstName,lastName'.sprintf( _url, $entity.nextPageToken )

API Throttling

Marketo throttles API calls; their documentation specifies 100 API calls in a 20-second window. Since our REST Snap paging includes an option to wait a number of seconds or milliseconds between requests, use it whenever you retrieve paginated results.

Downloads

Marketo REST.slp (14.7 KB)

JIRA Search to Email Pattern
This pattern queries JIRA for items submitted within the last day (at the time the pipeline is run) and within the specified projects, then sends out an email. It also uses the routing trick described in Performing an Action when there is no data to send an alternate email if nothing was reported.

Source: JIRA issues
Target: email
Snaps used: JIRA Search, Mapper, Sort, Join, Sequence, Router, Email Sender

You will need accounts for JIRA and the Email Sender Snaps. Set the following pipeline parameters:

- emailTo: who will receive the email
- emailFrom: who is sending the email
- JIRAurl: the URL for your instance of JIRA, for example https://company.atlassian.net
- projects: the JIRA projects you want to query as part of the JQL query in the JIRA Search Snap. The query is set up to search multiple projects (for example, "projects in (ABC,DEF,GHI)"). Refer to JIRA's Advanced Searching documentation if you wish to change this query.

The information sent in the email includes: Key (sent as a link to the issue in JIRA), Issue Type, Title, Priority, Submitter, Status, and Assignee.

Download

Pattern - JIRA All Recent Items.slp (17.1 KB)

Employee Journey: Employee Onboarding/Offboarding
SnapLogic and SnapLogic partner Eljun LLC demonstrate an employee onboarding/offboarding automation use case across a multi-endpoint stack. This solution fully automates account and file management when an employee is hired, and account deprovisioning when an employee leaves the company.

Applications in use: Workday, Active Directory, ServiceNow, Salesforce

Pipelines

The following pipelines are used in this solution.

Workday Event Based Integration
This pipeline is meant to be triggered by hire and termination events in Workday, which determine which child pipeline to execute.
Snaps used: JSON Parser, Mapper, Workday Read, Router, Pipeline Execute, Union
Child pipelines: Employee Journey-Onboarding-AD-SNOW-SF, Employee Journey-Offboarding-AD-SNOW-SF

Employee Journey-Onboarding-AD-SNOW-SF
This pipeline is called by the Workday Event Based Integration pipeline when a user is added.
Snaps used: Workday Read, Tail, Copy, JSON Formatter, File Writer, Mapper, Active Directory List Users, Active Directory Create Entry, Active Directory Update Entry, Exit, ServiceNow Query, ServiceNow Insert, Router, ServiceNow Update, ServiceNow Create, Head

Employee Journey-Offboarding-AD-SNOW-SF
This pipeline is called by the Workday Event Based Integration pipeline when a user is being offboarded.
Snaps used: Workday Read, Filter, Head, Copy, Mapper, Active Directory List Users, Active Directory Disable Entry, ServiceNow Query, ServiceNow Delete, Salesforce Read, Salesforce Update

Downloads

Workday Event Based Integration.slp (23.7 KB)
Employee Journey-Onboarding-AD-SNOW-SF.slp (38.4 KB)
Employee Journey-Offboarding-AD-SNOW-SF.slp (25.5 KB)

Dynamic Data Pivot
Created by @dwhite

This is an example of how to perform a dynamic pivot. The traditional Pivot Snap is static and has to be configured per set of data (see the example in the pipeline). This pipeline shows how to perform a pivot operation on data with variable fields that can be supplied at runtime, so the pivot configuration can be done on the fly during the run.

Configuration

Configure the dynamic pivot via parameter values:

- Enter the number of fields to split in the nSplitFields parameter.
- Enter the names of the fields being split in the splitFields parameter, as a comma-separated list.
- Enter the new fields to generate in the genFields parameter, as a comma-separated list.

For actual use, remove the sample data and the traditional Pivot Snap; they are included only for demonstration and comparison.

Sources: any flat data source that needs pivoting
Targets: any
Snaps used: CSV Generator, Copy, Pivot, Mapper, Sequence, Join, JSON Splitter, Group by n

Downloads

Dynamic Data Pivot.slp (15.7 KB)

Employee Journey: Recruiting: Position Creation
This pattern is part of the Recruitment Automation phase of the Employee Journey. SnapLogic and SnapLogic partner Eljun LLC demonstrate how the data from a new job position opened in Workday can be automatically pushed to an applicant tracking system, such as Jobvite.

Configuration

Applications in use: Workday, Snowflake, Jobvite
Snaps used: Mapper, Workday Read, REST Get, Router, REST Put, REST Post, Union, Snowflake - Insert

Downloads

PositionCreation_2021_03_18.slp (29.2 KB)

Ingest Salesforce contacts into Azure Storage (WASB)
This pipeline pattern is used to ingest Salesforce contacts into Windows Azure Storage Blob (WASB).

Source: Salesforce Contacts
Target: Azure Storage, Windows Azure Storage Blob (WASB)
Snaps used: Salesforce, Mapper, CSV Formatter, HDFS

Download

Ingest Salesforce Contacts into Azure Blob Storage_2018_06_28.slp (6.8 KB)
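The Mapper and CSV Formatter stages of this last pattern can be sketched outside SnapLogic as a small Python function: shape each Salesforce contact record to the target columns, then render the set as CSV before it is written to blob storage. The field names below are hypothetical examples, not taken from the actual pipeline.

```python
# Minimal sketch of the Mapper + CSV Formatter stage of this pattern.
# Column names are hypothetical; the real pipeline defines its own mapping.
import csv
import io

def contacts_to_csv(contacts):
    """Map raw contact records to the target columns and render CSV."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["Id", "FirstName", "LastName", "Email"])
    writer.writeheader()
    for c in contacts:
        # Mapper step: keep only the columns the target file needs,
        # defaulting missing values to empty strings.
        writer.writerow({k: c.get(k, "") for k in writer.fieldnames})
    return buf.getvalue()

# The resulting string is what the pipeline hands to the HDFS Snap for the
# write to WASB; in a standalone script you would upload it with an Azure SDK.
```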