Using GitHub as a code repository for SnapLogic artifacts

Hello SL community!!

Coming from a typical SOA/ESB background, having worked on tools like TIBCO BusinessWorks, webMethods, etc., I was wondering whether there is a way to use a code repository tool (GitHub, etc.) to store our projects and other artifacts.

I understand we have intuitive Import/Export capabilities, but this is manual and an explicit activity to be performed by a developer.

I was wondering:

  1. Is there anything available (or possible) to integrate SL and GitHub?
    This should ensure that a project and its assets get checked in to the GitHub repo without us doing any manual import/export.

  2. Can this further be used to move code from one environment to another (code migration)?

I am new to the SL world, but already finding the community really helpful :slight_smile:

All responses would be of help :slight_smile:

Thank you!
Sudhendu

I think @Bhavin may have some advice here as his LinkedIn says:

  • Prototyped integrating SnapLogic pipelines with CI/CD tool chain (ex: Jenkins/TeamCity)

(sorry, I like a google search ;))

I'm also interested in this, as we spend a lot of time and expertise designing and testing pipelines (as well as ad-hoc workflows), and relying on manual export with no change history/backup/CI is not ideal. We’re using both TeamCity and Jenkins (for CI and CD) on various other platforms, with GitHub as our source code repo. Any integration/scripts/plugins/webhooks for SL Snaplexes would be great.

(@nganapathiraju I see you posted on the TFS thread that customers are integrating with GitHub. Searching “github” in the documentation area has 0 matches; are there some whitepapers/guidance on how to integrate SL with GitHub?)

1 Like

Integration with GitHub is achieved via the GitHub REST API: https://developer.github.com/v3/
From a design perspective, this is how it works:

Create a SnapLogic pipeline that uses Meta Snaps (https://doc.snaplogic.com/wiki/display/SD/SnapLogic+Metadata+Snap+Pack) to get a list of SnapLogic assets (pipelines, tasks, files, and accounts)

Invoke the GitHub REST API (uses HTTP basic auth - https://doc.snaplogic.com/wiki/display/SD/Basic+Auth)

Read from or write to GitHub

The pipelines use pipeline parameters to decouple runtime parameters from the actual implementation logic, so when you invoke these pipelines you can specify which SnapLogic projects to read, what assets to check in to GitHub, which repo to use on the GitHub side, and so on.

We have implemented a bi-directional flow, i.e. you can check source code in to and out of GitHub.
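
To make the GitHub side concrete, here is a minimal sketch (outside SnapLogic, in Python) of the read/write step against the GitHub v3 contents API. The owner, repo, file path, and credentials are placeholders; note that GitHub now expects a personal access token in place of the account password for basic auth.

    # Minimal sketch of the "read or write to GitHub" step using the GitHub v3
    # contents API (https://developer.github.com/v3/). Owner, repo, path, and
    # credentials are placeholders.
    import base64
    import requests

    GH_AUTH = ("your-github-user", "token-or-password")  # basic auth, as in the pipelines
    url = "https://api.github.com/repos/your-github-user/cicd-demo/contents/BK/DEV/MyPipeline.slp"

    # Read: fetch the current file (if any) so we have its sha for updates.
    resp = requests.get(url, auth=GH_AUTH)
    sha = resp.json().get("sha") if resp.status_code == 200 else None

    # Write: check in the exported pipeline definition; content must be base64 encoded.
    with open("MyPipeline.slp", "rb") as f:
        body = {
            "message": "Check in SnapLogic pipeline export",
            "content": base64.b64encode(f.read()).decode("ascii"),
        }
    if sha:
        body["sha"] = sha  # required by GitHub when updating an existing file

    print(requests.put(url, json=body, auth=GH_AUTH).json())

Inside SnapLogic the same calls are made with the REST snaps and a basic auth account, with the owner, repo, and path supplied as pipeline parameters.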

The attached SnapLogic project export has all the required files. Please note that this is a custom solution; to use it you’ll need to keep your GitHub credentials ready (repo name, username, and password), create a basic auth account in SnapLogic, and pass it on to the pipelines.

You may struggle a bit, but don’t give up, keep pounding and eventually you’ll crack it :slight_smile:

The SnapLogic project export is attached; please import it using these steps: https://doc.snaplogic.com/wiki/display/SD/How+to+Import+and+Export+Projects

Now for this

Can this further be used to move code from one environment to another (code migration)?

try this API

API Detail:

Syntax = https://elastic.snaplogic.com:443/api/1/rest/public/project/migrate/ORG/SPACE/PROJECT
Authorization Header = Basic Auth, pass your SnapLogic username/password
Body = application/json

Example:

https://elastic.snaplogic.com:443/api/1/rest/public/project/migrate/ConnectFasterInc/BK/DEV

{
    "dest_path": "/tacobell/projects/bk",
    "asset_types": ["File", "Job", "Account", "Pipeline"],
    "async": "true",
    "duplicate_check": "false"
}

Response:

{
    "response_map": {
        "status_token": "6e6600cd-2992-4423-95c3-ffb94293a3bd",
        "status_url": "http://elastic.snaplogic.com/api/1/rest/public/project/migrate/6e6600cd-2992-4423-95c3-ffb94293a3bd"
    },
    "http_status_code": 200
}

This runs as an async call and will migrate (copy) everything from ConnectFasterInc/BK/DEV to /tacobell/projects/bk; you can check the status of the migration by visiting status_url.
If a project already exists and duplicate_check is set to false, it will create another project with the same name appended by (NUMBER); e.g. if bk already exists inside /tacobell/projects, then subsequent runs will add bk(1), bk(2), and so on. I wish we had an “overwrite” or “merge” parameter option, but nevertheless this is much easier than the Meta Snaps (IMO).
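
As a rough sketch, assuming the migrate call is a POST (since it carries a JSON body), the same request from Python looks like this; the credentials are placeholders, and the exact fields returned by status_url may differ, so inspect the response in your own org:

    # Sketch of invoking the project migrate API shown above;
    # SnapLogic credentials are placeholders.
    import requests

    SL_AUTH = ("your-snaplogic-user", "your-password")  # SnapLogic basic auth
    url = ("https://elastic.snaplogic.com:443/api/1/rest/public/"
           "project/migrate/ConnectFasterInc/BK/DEV")

    body = {
        "dest_path": "/tacobell/projects/bk",
        "asset_types": ["File", "Job", "Account", "Pipeline"],
        "async": "true",
        "duplicate_check": "false",
    }

    resp = requests.post(url, json=body, auth=SL_AUTH)
    resp.raise_for_status()

    # The call is asynchronous; check progress by fetching status_url.
    status_url = resp.json()["response_map"]["status_url"]
    print(requests.get(status_url, auth=SL_AUTH).json())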

BK-Github Integration.zip (12.1 KB)

7 Likes

This is so helpful Bhavin! Thank you for jotting this down. I will try this out and let you know how it goes.

I am pretty sure it will be helpful to us as well as larger community here on SnapLogic!

Many thanks for such a lot of detail and depth @Bhavin ! I will pass this on to our project team; a bit too intricate for me :).

Are there any plans to add native support for GitHub/TFS into the product in the future? And integrations with TeamCity/Jenkins/etc.?

Mike/Sudhendu, glad that you found it useful; these pipelines could easily be invoked via a Jenkins/TeamCity pipeline.

Before we jump into Jenkins/TeamCity or any CI/CD tool chain, we need to understand what kind of “artifacts” SnapLogic generates. At a high level you have a SnapLogic project, which resides in an ORG/SPACE/PROJECT hierarchy where ORG is the tenant, SPACE could be mapped to an org unit (for example BI, DEV, any project name, and so on), and within each space you can have more than one PROJECT.

When it comes to artifacts, you have:

Pipelines

Jobs (Scheduled, Triggered, Ultra)

Accounts

Files (could be anything: XML, JSON, script files, XSLT, CSV, and so on that your pipelines depend upon)

Pipelines are the actual workhorse and are invoked via jobs. For lack of a better term, pipelines are interpreted by the execution engine (JCC, aka node), and hence there is not much to “build”. With this architecture a typical CI/CD workflow may not directly fit; nevertheless, I have seen that customers still like to leverage their favorite CI/CD tool chain to automate as much as they can, with tasks like:

 Sharing SnapLogic assets via GitHub

 Promoting projects from one ENV to another

 Using Jenkins as the proverbial "rug" that ties the room together :)

Here is an actual implementation: we invoke SnapLogic pipelines (triggered tasks) from a Jenkins job as an HTTP call.

Tools required

Pipeline design and Things to know

There are two projects involved:

Project_Promotion = Utilizes the Meta Snap Pack to promote assets within the same and different environments.
Github_Integration = Utilizes the Meta Snap Pack and REST Snap Pack to promote assets within the same and different environments. We are utilizing the GitHub REST APIs, and they are invoked via basic auth, i.e. your GitHub login account.

Pipelines:
Project_Promotion has:
01 Main - Migrate Project
    exposed as a triggered task
    calls the rest of the pipelines via Pipeline Execute
    pipeline parameters:

    source_proj
    target_proj
    include_account
    target_org
    source_space
    include_pipeline
    update_task
    target_space
    update_account
    include_task
    source_org

Example values:

    account_org = ConnectFasterInc
    account_space = LCM
    account_proj = Artifacts
    source_org = ConnectFasterInc
    source_space = LCM
    source_proj = DEV
    target_org = ConnectFasterPOC
    target_space = BK
    target_proj = PROD
    GH_Owner = snapsrepo or your github account username
    GH_Repo = reponame ex: cicd-demo
    GH_Source_Path = relative path to repo ex: BK/DEV (case sensitive)
    include_pipeline = true or false
    include_account = true or false
    include_task = true or false
    update_account = true or false

The finished product would have a Jenkins pipeline that utilizes the HTTP Request plugin to invoke SnapLogic pipelines that read/write to GitHub and also promote projects from one environment to another in SnapLogic. Using pipeline parameters we decouple the source and target locations, along with which SnapLogic artifacts to “include”, at Jenkins job invocation time.
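
As a rough sketch of that HTTP call (the same thing the Jenkins HTTP Request plugin does), the triggered task can be invoked with the pipeline parameters above passed as query parameters. The task URL and bearer token below are placeholders; copy the real ones from the triggered task’s details in Manager.

    # Sketch of invoking the "01 Main - Migrate Project" triggered task over
    # HTTP, passing a subset of the pipeline parameters listed above as query
    # parameters. Task URL and token are placeholders.
    import requests

    TASK_URL = "https://elastic.snaplogic.com/api/1/rest/slsched/feed/ConnectFasterInc/LCM/DEV/01%20Main%20-%20Migrate%20Project"  # placeholder
    TASK_TOKEN = "your-triggered-task-bearer-token"  # placeholder

    params = {
        "source_org": "ConnectFasterInc",
        "source_space": "LCM",
        "source_proj": "DEV",
        "target_org": "ConnectFasterPOC",
        "target_space": "BK",
        "target_proj": "PROD",
        "include_pipeline": "true",
        "include_account": "true",
        "include_task": "true",
        "update_account": "true",
        "update_task": "true",
    }

    resp = requests.get(TASK_URL, params=params,
                        headers={"Authorization": "Bearer " + TASK_TOKEN})
    print(resp.status_code, resp.text)

In Jenkins this is just a pipeline stage (or freestyle build step) that issues the same request via the HTTP Request plugin.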

CICD-SnapLogic-Projects.zip (31.2 KB)

snaplogic-jenkins.docx (344.9 KB)

4 Likes

Wow this is very helpful.

I have been meaning to try to tackle this problem for over a year now, but never got around to it.

Now I know that it is possible.

Thanks,
TK

Thanks very much, Bhavin, for your valuable input. One suggestion to make the SnapLogic ecosystem more robust: can we have these critical topics documented as blogs? It will be easier for the community to access the knowledge repository.

Best,
YV

Good feedback.

We are currently looking at our various information platforms (docs, community, knowledgebase, blog, white papers, etc.) and our user community makeup to best determine what information should be tracked where.

1 Like

Hi @Bhavin,

The pipeline is really helpful and I managed to do a quick check.
However, when I used the pipeline to import projects from GitHub to SL, I can see the asset names being appended with extra values, like

Try Snaps-fce2f794-2e8e-48aa-90de-2cb1e762cb81

Instead of
Try Snaps

Do you know any obvious reason for this?
Am I missing any configuration?

/Krupali

Check the File Writer; somewhere it is appending the string

Try Snaps-fce2f794-2e8e-48aa-90de-2cb1e762cb81

“Try Snaps” must be getting appended at the beginning of the expression.

If I had to guess, it is

'Try Snaps' + pipe.ruuid

Let us know if you cannot find it.

1 Like

Thank you for the reply.

The issue has been solved; the last snap had, for each asset, + "-" + pipe.ruuid in the expression,

like below,

_SL_Target_Proj + "/" + $property_map.info.label.value + "-" + pipe.ruuid

I removed that and it works as expected. Thanks again.

1 Like

So the solution provided here is just an asset “copy”,
meaning that it simply copies the original asset from the source location to the destination location.
GitHub is just a place to store the location of the original assets.

The promotion pipeline will read the files in GitHub to get the location of the original asset and move (copy) it to the destination location.

The problem with this is that the files stored in GitHub are not being used as source code. If I remove the asset from the source location, then the promotion pipeline will not be able to migrate the asset from source to destination, as it is doing a copy and the asset is missing.

Is there a way to treat the files stored in GitHub as the true source code and import this source code into a new org? In that case, even if someone removes all the assets from the source org, by importing the files from Git we should still be able to migrate all the assets into the destination org.

Thanks

See Product Management’s request for information about GitHub Integration in the Enhancement category for the opportunity to provide input into this feature request: GITHUB integration

Hi Bhavin, thanks for this project; it really helped us a lot. I had a doubt: will this project work fine in the case of pipelines having nested pipelines, i.e. from GitHub to SL?

Hi,

Note I have also shared our current solution for source control in the other thread: GITHUB integration.

Cheers,
C.J.

This is working fine when dealing with the default branch… but how do I work with a specific branch?

Maybe you can use the ref parameter in the API requests to commit to another branch; I also use this parameter in the APIs.
A better way is to take this as a pipeline parameter and use that in the API call; see the sketch below.
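
To make that concrete, a rough sketch assuming the GitHub v3 contents API: on the read side the branch goes in as a ref query parameter on the GET URL (e.g. ...?ref=my-branch), and on the write side the PUT request body carries a branch field, something like:

    {
        "message": "Check in pipeline export to a non-default branch",
        "content": "<base64-encoded file content>",
        "branch": "my-branch",
        "sha": "<sha of the existing file; required only when updating an existing file>"
    }

Both the branch name and the repo path could come in as pipeline parameters, as suggested above.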

I’ve been trying for two days but am unable to do it… my REST Get is retrieving data for the branch… but the Put is not putting data into the branch… can you explain how to do it?

Use this as the request body for the REST Put snap.

2 Likes