Recent Discussions
Azure Service Bus Integration
Created by @elesh.mistry

The video that was here is no longer available. Pipelines demonstrated in this video:

Azure Service Bus (Write) V3_2018_06_22.slp (12.2 KB)
Azure Service Bus (Read and Delete) V2_2018_06_22.slp (12.5 KB)
Connecting to Marketo with the REST Snap Pack

Although SnapLogic’s REST OAuth account supports OAuth 2.0, it does not work with Marketo’s OAuth implementation. To work with Marketo, you must authenticate manually using the REST Get Snap. In this pipeline, we pass the credentials in as pipeline parameters. Note: this method exposes your credentials in the pipeline.

Authorization

To simplify the process, define the following pipeline parameters:

url: the REST API URL for your Marketo instance, such as https://xxx-xxx-xxx.mktorest.com
clientID: the client ID for API access.
clientKey: the client secret for API access.

Add a REST Get Snap (labeled Marketo Login here) and configure it as follows:

For Service URL, toggle on the Expression button ( = ) and set the field to:
_url + '/identity/oauth/token?grant_type=client_credentials&client_id=' + _clientID + '&client_secret=' + _clientKey
Remove the input view.

Validate the Snap; it will return a response that contains an access_token and scope. In this example, we follow the REST Get with a Mapper Snap to map the token out of the array.

Using the Access Token

In subsequent Snaps, we pass this token as a header rather than a query parameter, because it simplifies paged operations such as Get Lead Changes. Here’s an example of a simple call that does this:

For Service URL, toggle on the Expression button ( = ) and set the field to:
_url + '/rest/v1/activities/types.json'
Under HTTP Header, set Key to Authorization and, with the Expression button ( = ) toggled on, set Value to:
'Bearer ' + $accessToken

Paged Operations

For more complex operations, such as getting lead changes, you need to make two API calls: the first creates a paging token, and the second uses that paging token, typically with the paging mechanism enabled in the REST Get Snap.

Get Paging Token

This REST Get Snap (renamed Get Paging Token for clarity) is where you specify the query parameters. For instance, if you want to get lead changes since a particular date, you’d pass that in via “sinceDateTime”. The example uses a literal string, but this could be a pipeline parameter or, ideally, a Date object formatted to match what Marketo expects.

_url + '/rest/v1/activities/pagingtoken.json'

Configure Paging Mechanism

When calling Get Leads (via a REST Get Snap), keep a few things in mind:

You need to pass “nextPageToken” as a query parameter, along with the fields you want back. Ideally, the list of fields should be in a pipeline parameter because it appears twice in this configuration.
The leads are returned in $entity.result, which is an array. This field will not exist if there are no results, so you need to enable “Null safe” on a Splitter Snap after this REST Get.

Paging expressions for the REST Get Snap are:

Has next: $entity.moreResult == true
Next URL: '%s/rest/v1/activities/leadchanges.json?nextPageToken=%s&fields=firstName,lastName'.sprintf( _url, $entity.nextPageToken )

API Throttling

Marketo throttles API calls; their documentation says “100 API calls in a 20 second window”. Since REST Snap paging now includes an option to wait a number of seconds or milliseconds between requests, use it whenever you are retrieving paginated results. The end-to-end call sequence is sketched in Python after the downloads list.

Downloads

Marketo REST.slp (14.7 KB)
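To make the sequence concrete, here is a minimal Python sketch of the same calls using the requests library. It is not part of the pipeline: the base URL, credentials, and since-date are placeholders, and the field names (access_token, nextPageToken, result, moreResult) are the ones referenced above, accessed directly on the response rather than under the Snap's $entity wrapper.

```python
# Minimal sketch of the Marketo call sequence described above, assuming the
# requests library. Credentials and dates are placeholders.
import time
import requests

BASE_URL = "https://xxx-xxx-xxx.mktorest.com"  # your Marketo REST endpoint
CLIENT_ID = "your-client-id"                   # placeholder
CLIENT_SECRET = "your-client-secret"           # placeholder

# 1. Manual OAuth: exchange client credentials for an access token.
token_resp = requests.get(
    f"{BASE_URL}/identity/oauth/token",
    params={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    },
)
access_token = token_resp.json()["access_token"]

# 2. Pass the token as a header (this is what simplifies paged operations).
headers = {"Authorization": f"Bearer {access_token}"}

# 3. Get a paging token for changes since a given date (hardcoded here;
#    in the pipeline this would be a parameter or formatted Date object).
paging_resp = requests.get(
    f"{BASE_URL}/rest/v1/activities/pagingtoken.json",
    headers=headers,
    params={"sinceDateTime": "2018-01-01"},
)
next_page_token = paging_resp.json()["nextPageToken"]

# 4. Page through lead changes until moreResult is false, pausing between
#    requests to stay under the 100-calls-per-20-seconds limit.
fields = "firstName,lastName"
while True:
    page = requests.get(
        f"{BASE_URL}/rest/v1/activities/leadchanges.json",
        headers=headers,
        params={"nextPageToken": next_page_token, "fields": fields},
    ).json()
    for lead_change in page.get("result", []):  # "result" is absent when empty
        print(lead_change)
    if not page.get("moreResult"):
        break
    next_page_token = page["nextPageToken"]
    time.sleep(0.5)  # simple throttle between paged requests
```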
CI/CD Solution with Bitbucket

Submitted by @Linus and @uchohan

The pipelines in this solution are for a proposed CI/CD process. The implementation and documentation enable the following capabilities:

Ability to source control any SnapLogic Pipeline, Task, and Account
Commit entire project
Commit individual asset
Specify commit message
Specify branch name
Automatic Bitbucket project, repository, and branch creation
Automatic Bitbucket CI/CD file upload and Pipeline enablement
Automatic SnapLogic project space, project, and asset creation
Ability to pull assets from Bitbucket to a SnapLogic project
Revert changes based on a specific branch
Revert entire project or specific asset
SnapLogic Compare Pipeline review
Bitbucket Pull Request creation, approval, and merge
Automatic promotion of assets from development to production

Terminology

A SnapLogic project space (which belongs to a SnapLogic organization) is mapped to a Bitbucket project.
A SnapLogic project (which belongs to a SnapLogic project space) is mapped to a Bitbucket repository (which belongs to a Bitbucket project).
Each repository has one or more Bitbucket branches. By default, the master branch reflects the state of assets in the SnapLogic production organization. Additional branches (feature branches) inherit from the master branch and reflect new development efforts in the SnapLogic development organization.

Developer assets

Each SnapLogic user who will commit or pull assets to the Bitbucket space can have their own individual assets. It is recommended that each user duplicate the User_Bitbucket project and replace User with their unique name. Although covered in greater detail in the attached PDF, the User_Bitbucket project holds these four Pipelines, each containing a single Snap:

Commit Project - Commits any Pipelines, Accounts, and Tasks within the specified SnapLogic project to the specified branch in Bitbucket
Commit Asset - Commits the specified asset within the specified SnapLogic project to the specified branch in Bitbucket
Pull Project - Reads any Pipelines, Accounts, and Tasks from the specified branch in the specified Bitbucket repository to the specified project and organization in SnapLogic
Pull Asset - Reads the specified asset from the specified branch in the specified Bitbucket repository to the specified project and organization in SnapLogic

For each Pipeline, each user needs to update the bitbucket_account Pipeline Parameter in the respective Snaps to match the path to their own Bitbucket Account. A sketch of the underlying commit call appears after the downloads list.

Downloads

Documentation

CI_CD Documentation.pdf (1.3 MB)

For the User_Bitbucket project:

Commit Asset.slp (3.5 KB)
Commit Project.slp (3.4 KB)
Pull Asset.slp (3.6 KB)
Pull Project.slp (3.6 KB)

Note: These pipelines all rely on shared pipelines located in a CICD-BitBucket project. Make sure to update the mappings to the pipelines within the CICD-BitBucket project to your location.

For the CICD-BitBucket project:

1.0 Main - SL Project to Bitbucket.slp (17.5 KB)
1.1 Create Project and Repo.slp (19.2 KB)
1.2 SL Asset to Bitbucket.slp (14.8 KB)
2.0 Main - Migrate Assets To SL.slp (23.1 KB)
2.1 Upsert Space And Project.slp (16.4 KB)
2.2 Read Assets.slp (29.3 KB)
2.2.1 Upsert Pipeline To SL.slp (12.8 KB)
2.2.2 Upsert Account To SL.slp (17.9 KB)
2.2.3 Upsert Task To SL.slp (21.2 KB)
PromotionRequest.slp (26.0 KB)
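For readers who want to see the shape of the commit operation outside SnapLogic, here is a hedged Python sketch of pushing one exported asset to Bitbucket Cloud via its 2.0 "src" endpoint, roughly what the Commit Asset pipeline does. The workspace, repository, branch, file path, and credentials are all placeholders; the actual pipelines handle this through their own Snaps and the bitbucket_account parameter.

```python
# Hedged sketch: commit one exported SnapLogic asset to a Bitbucket Cloud
# repository via the 2.0 "src" endpoint. All names below are placeholders.
import requests

WORKSPACE = "my-workspace"        # placeholder Bitbucket workspace
REPO_SLUG = "my-snaplogic-repo"   # placeholder repository
BRANCH = "feature/my-change"      # feature branch off master

with open("Commit Asset.slp", "rb") as f:
    pipeline_export = f.read()

resp = requests.post(
    f"https://api.bitbucket.org/2.0/repositories/{WORKSPACE}/{REPO_SLUG}/src",
    auth=("bitbucket-user", "app-password"),  # placeholder credentials
    data={
        "message": "Commit pipeline from SnapLogic",  # commit message
        "branch": BRANCH,                             # branch to commit to
    },
    # Each file part's name is the path inside the repository.
    files={"pipelines/Commit Asset.slp": pipeline_export},
)
resp.raise_for_status()
```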
Convert Excel Columns to JSON Arrays

Contributed by @hkaplan

This pipeline reads in an Excel file with multiple columns and 1 to n rows, some containing null cells, and converts each Excel column into a JSON array with all of the null values removed.

Configuration

The trick here is the Aggregate Snap’s Concat function.

Source: Excel spreadsheet (columnar data)
Target: Excel columnar data as an array
Snaps used: File Reader, Excel Parser, Aggregate, Mapper

Downloads

Convert Excel Columns to Arrays.slp (12.6 KB)
ExcelColumnToArray.xlsx (8.7 KB)
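The same transformation can be sketched in a few lines of Python with pandas; this is only an illustration of what the Aggregate Snap's Concat function produces, not part of the pipeline. The file name comes from the sample workbook in the downloads.

```python
# Minimal sketch: one JSON array per Excel column, nulls dropped.
# Assumes pandas and openpyxl are installed for .xlsx reading.
import json
import pandas as pd

df = pd.read_excel("ExcelColumnToArray.xlsx")  # sample file from the downloads

# Build one JSON array per column, skipping null cells.
arrays = {col: df[col].dropna().tolist() for col in df.columns}
print(json.dumps(arrays, indent=2, default=str))
```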
Encrypt and decrypt sensitive data in a source

Created by @pavan

This pipeline pattern encrypts fields passed in as JSON documents using a defined transform type (AES), then decrypts them to return the original message. This pattern is useful for encrypting sensitive data (credit card info, SSNs, patient names, dates of birth, etc.).

Configuration

Within the JSON Generator, replace “Enter certificate here” with your own certificate.

Sources: JSON
Targets: JSON
Snaps used: JSON Generator, Encrypt Field, Mapper, Decrypt Field

Downloads

Encrypt & Decrypt Fields.slp (9.1 KB)
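As a point of comparison, here is a minimal Python sketch of field-level AES encryption using the cryptography package's Fernet recipe (AES-128-CBC with HMAC). It stands in for the Encrypt Field and Decrypt Field Snaps only conceptually; the key handling, record, and field names are placeholders.

```python
# Sketch of field-level encrypt/decrypt, assuming the cryptography package.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, load a managed key; don't generate per run
fernet = Fernet(key)

record = {"name": "Jane Doe", "ssn": "123-45-6789"}  # sample sensitive document
SENSITIVE_FIELDS = ["ssn"]

# Encrypt only the sensitive fields, leaving the rest of the document intact.
encrypted = dict(record)
for field in SENSITIVE_FIELDS:
    encrypted[field] = fernet.encrypt(record[field].encode()).decode()

# Decrypting gives back the original values.
decrypted = dict(encrypted)
for field in SENSITIVE_FIELDS:
    decrypted[field] = fernet.decrypt(encrypted[field].encode()).decode()

assert decrypted == record
```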
Archiving Files

Submitted by @stodoroska from Interworks

This pipeline reads a file from the source location, writes it to the archive location, and then deletes the file from the source location.

Configuration

The source and archive locations are configured using pipeline parameters.

Sources: Files on a file-sharing system
Targets: Files on a file-sharing system
Snaps used: File Reader, File Writer, and File Delete

Downloads

Generic.Archive.slp (5.9 KB)
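The read-archive-delete sequence is simple enough to sketch in Python for readers who want to see the logic outside SnapLogic; the paths are placeholders, and the source and archive locations are passed in as parameters to mirror the pipeline parameters.

```python
# Sketch of the archive pattern: copy to the archive, then delete the source.
import shutil
from pathlib import Path

def archive_file(source: str, archive_dir: str) -> None:
    src = Path(source)
    dest = Path(archive_dir) / src.name
    shutil.copy2(src, dest)  # write to the archive location first
    src.unlink()             # delete the source only after the copy succeeds

archive_file("/data/incoming/orders.csv", "/data/archive")
```

Deleting only after the write succeeds is the point of the ordering: a failed copy leaves the source file in place for a retry.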
Reference Implementation - Integration Pattern - On-Prem Database to Cloud Datawarehouse

This is a common integration pattern that customers use as they move from on-prem to the cloud. The following videos describe the pattern, and the attached reference implementation pipelines help you get a head start on the implementation.

Part 1
Part 2

Pipelines Attached

The pipelines move all the tables from an Oracle database to Redshift. They are designed to follow our best practices, such as parameterization of all aspects of the pipeline (e.g., account names, DB names, table names). All are dynamic and controlled via pipeline parameters. A minimal Python sketch of the parent/child orchestration follows the downloads list.

Source: Oracle tables
Target: Redshift tables
Snaps used:
Step 1: Oracle - Table List, Filter, Oracle - Execute, Mapper, Pipeline Execute
Step 2: Mapper, Redshift - Execute, Router, Join, Oracle - Select, Shard Offsets*, Redshift - Bulk Load, Pipeline Execute, Exit, Union
Step 3: Mapper, Oracle - Select, Redshift - Bulk Load, Exit

*This is a Snap developed by the SnapLogic Professional Services team.

Downloads

Pattern 1 - Step 1.0 - Oracle to Redshift Parent_2018_06_25.slp (9.4 KB)
Pattern 1 - Step 1.1 - Oracle to Redshift No Shredding_2018_06_25.slp (20.9 KB)
Pattern 1 - Step 1.2 - Oracle to Redshift Shred_2018_06_25.slp (7.7 KB)
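This is a hedged sketch of the parent/child shape only, not the pipelines' actual logic: it assumes the python-oracledb and psycopg2 packages, assumes each table's data has already been staged to S3, and uses placeholder connection details, bucket, and IAM role.

```python
# Sketch: list the Oracle tables (parent), then run one per-table load (child),
# with everything parameterized the way the pipelines are.
import oracledb
import psycopg2

def list_oracle_tables(conn) -> list[str]:
    with conn.cursor() as cur:
        cur.execute("SELECT table_name FROM user_tables")
        return [row[0] for row in cur.fetchall()]

def load_table_to_redshift(rs_conn, table: str) -> None:
    # Stand-in for the child pipeline: bulk-load one staged table via COPY.
    with rs_conn.cursor() as cur:
        cur.execute(
            f"COPY {table} FROM 's3://my-bucket/staged/{table}/' "
            "IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load' CSV"
        )
    rs_conn.commit()

ora = oracledb.connect(user="src_user", password="...", dsn="oracle-host/service")
rs = psycopg2.connect(host="redshift-host", dbname="dw", user="rs_user", password="...")

for table in list_oracle_tables(ora):  # parent: one child run per table
    load_table_to_redshift(rs, table)
```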
Migration Patterns

The following patterns migrate assets from one project to another in the same org. These patterns make use of the SnapLogic Metadata Snaps. A sketch of the shared list-read-create flow follows the downloads list.

Source: Existing accounts, files, and pipelines within SnapLogic
Target: A second project within SnapLogic
Snaps used: SnapLogic Metadata Snaps, Mapper

Requirements

You must have access to both projects. You will need to define the following pipeline parameters:

source_path, in the form of /orgname/projectspace/project
target_path, in the form of /orgname/projectspace/project

Migrate Accounts

The SnapLogic List Snap gathers the list of accounts in the specified source_path parameter. The SnapLogic Read Snap reads the incoming $path for the accounts. The Mapper Snap maps the target path. The SnapLogic Create Snap writes the accounts to the target location.

Migrate Files

The SnapLogic List Snap gathers the list of files in the specified source_path parameter. The Mapper Snap maps the source path to a $source_path field for use in the Read Snap. The SnapLogic Read Snap reads the incoming $path for the files. The SnapLogic Create Snap writes the files to the target location.

Migrate Pipelines

The SnapLogic List Snap gathers the list of pipelines in the specified source_path parameter. The SnapLogic Read Snap reads the incoming $path for the pipelines. The Mapper Snap maps the target path. The SnapLogic Create Snap writes the pipelines to the target location.

Pipeline Downloads

Migrate Accounts.slp (5.7 KB)
Migrate Files.slp (6.0 KB)
Migrate Pipelines.slp (5.7 KB)
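All three patterns share the same list, read, re-map path, create sequence. The Python sketch below exists only to make that flow concrete: the snaplogic_client module and its functions are hypothetical stand-ins for the SnapLogic Metadata Snaps, and the paths are placeholders.

```python
# Hypothetical sketch of the shared migration flow; snaplogic_client and its
# functions are invented stand-ins for the List, Read, Mapper, and Create Snaps.
from snaplogic_client import list_assets, read_asset, create_asset  # hypothetical

SOURCE_PATH = "/orgname/projectspace/source-project"  # placeholder
TARGET_PATH = "/orgname/projectspace/target-project"  # placeholder

for asset_type in ("account", "file", "pipeline"):
    for asset in list_assets(SOURCE_PATH, asset_type):            # SnapLogic List
        content = read_asset(asset["path"])                       # SnapLogic Read
        target = asset["path"].replace(SOURCE_PATH, TARGET_PATH)  # Mapper
        create_asset(target, content)                             # SnapLogic Create
```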
About the Pipeline Patterns Catalog for IIP category

This section contains a library of sample pipeline patterns created by the SnapLogic team, customers, and partners, available to you within the product. Learn more about using Pipeline Patterns on snaplogic.com. Are there patterns you would like to see? Let us know.

We encourage SnapLogic users to submit their pipeline patterns. Submitted pipeline patterns will be reviewed and certified by the SnapLogic team, then added to this library for users to download and use. To submit patterns, please fill out the form under the Pattern Submission box.

Accessing Patterns

There are several ways in which you can access these patterns.

In the Community

In this section of the Community, you can search for a pattern. When you find one you want, you can download it from the Community and upload it to the SnapLogic Intelligent Integration Platform by importing a pipeline.

In Designer

Within the product, you can find the same set of patterns in Designer under the Patterns tab in the Cloud Pattern Catalog.

In Studio (in progress)

As part of the new UI known as Studio, we are implementing a new pattern catalog with in-product documentation. Let us know what you think!
Flatten JSON files into CSV files

Created by @schelluri

This pipeline pattern flattens a JSON file containing multiple objects and turns it into a CSV file.

Configuration

Sources: JSON Generator
Targets: CSV file
Snaps used: JSON Generator, JSON Formatter, JSON Parser, Script, CSV Formatter, File Writer

Downloads

MS_Flatten_Script.slp (31.4 KB)
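For reference, here is a minimal Python sketch of the flattening step itself (the pipeline performs this inside a Script Snap; this is not its actual script). Nested objects are flattened into dotted column names before the rows are written as CSV, and the sample document is a placeholder.

```python
# Sketch: flatten nested JSON objects into dotted keys, then write CSV rows.
import csv

def flatten(obj: dict, prefix: str = "") -> dict:
    """Recursively flatten nested dicts into dotted keys."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

docs = [{"id": 1, "name": {"first": "Ada", "last": "Lovelace"}}]  # sample input
rows = [flatten(d) for d in docs]

with open("out.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=rows[0].keys())
    writer.writeheader()  # columns: id, name.first, name.last
    writer.writerows(rows)
```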