Forum Discussion

stephenknilans
Contributor
7 years ago

Is there a way to aggregate arrays from several objects in documents from the same source

I was asked to change a relatively complex pipeline in a few ways, and was trying to see if I could do it without ANY scripting. I figured it might be possible, but I can’t figure out how to do the aggregation without a Script snap. Is there a way?

I have a number of documents coming from a system; each contains some number of objects representing information about a given email. There may be 1 object, or there could be 100. For all of the emails found on an object with a given URI, I can split the objects up so they have the information I need, but they then have to be aggregated into an array containing all of that email information, but ONLY for that URI. Then I can send them through a Mapper snap I created to produce the proper document for processing. After that, if they want to add or remove fields, filter somehow, or change the JSON, it will be as simple as any other SnapLogic mapping.

To be clear, I don’t think the splitting would be hard, and I have a Mapper to create the email object from the incoming source at that point. I just need a way to put those objects into an array that could, in theory, hold hundreds of entries, so I can push it through the next snap to create the document I need. All objects with a given URI would have to go into the same array.

My fallback choice is to create a Script snap that aggregates things internally and then emits the documents.
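For illustration only, here is a plain-Python sketch of the grouping the Script snap fallback would perform. The field names `uri` and `email` are hypothetical placeholders for whatever the split documents actually contain; note that SnapLogic's Group By Fields snap performs this kind of grouping natively, which may avoid scripting entirely.

```python
from itertools import groupby
from operator import itemgetter

def aggregate_by_uri(docs):
    """Group split email documents by URI and emit one document per URI
    whose 'emails' field is the aggregated array.

    'uri' and 'email' are placeholder field names, not the real schema.
    groupby() requires its input to be sorted on the grouping key.
    """
    docs = sorted(docs, key=itemgetter("uri"))
    return [
        {"uri": uri, "emails": [d["email"] for d in group]}
        for uri, group in groupby(docs, key=itemgetter("uri"))
    ]
```

The output is one document per URI, each carrying the full array of email objects for that URI, ready to feed into the downstream Mapper.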

2 Replies

  • AdityaReja
    New Contributor III

    The attachment doesn’t contain the .slb file needed to import into SnapLogic.

    • dmiller
      Former Employee

      There isn’t an .slb file format.
      Individual pipelines are exported out as .slp files.
      Projects are exported out as .zip files and imported in as such within Manager. Once you import the project, you should see the 4 pipelines.

  • We are able to use this to move assets to our Stage or Prod org (and back), but we do not seem to be able to move them to our UAT environment, which is where we need to test our pipeline against the forced upgrade on November 10th.

    When we try, we get:
    The requested resource at the supplied path does not exist: /SVG-UAT/projects

    Any word on how to move resources to the UAT environment?

  • Is there any way to export pipelines as .slp files without downloading each pipeline file manually?

    Basically, I am looking for an automated way to export the entire (or a specific) set of pipelines to S3 or other storage for backup.

    I don’t want a manual import/export process; I need the best approach to get the .slp files via automation.

    Please provide any suggestions. Thanks in advance.
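As a sketch only: the general shape of such an automation is to call a pipeline-export endpoint with an authenticated request and write the result out locally for upload to S3. The base URL and path format below are hypothetical placeholders; consult the SnapLogic public API documentation for the actual export endpoint before using this.

```python
import urllib.request
from pathlib import Path

# Hypothetical base URL -- check the SnapLogic public API docs for the
# real export endpoint and its path format.
BASE = "https://elastic.snaplogic.com/api/1/rest/public"

def pipeline_export_url(org: str, project: str, pipeline: str) -> str:
    """Build the (assumed) export URL for a single pipeline."""
    return f"{BASE}/pipeline/export/{org}/{project}/{pipeline}"

def download_slp(url: str, token: str, dest: Path) -> None:
    """Fetch one .slp and write it locally.  Upload to S3 afterwards
    with your tool of choice (e.g. the AWS CLI or boto3)."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req) as resp:
        dest.write_bytes(resp.read())
```

Wrapped in a loop over project contents and scheduled from cron or a CI job, this gives an unattended .slp backup without touching the Designer UI.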

  • I am developing a pipeline that can be used for checking pipelines in and out of Git and migrating pipelines/tasks/accounts from Git (not GitHub) to the SnapLogic Prod org. I have created a service account in SnapLogic. I need to run the migration pipelines as the service account, so that any pipelines/tasks/accounts created in the production SnapLogic org are owned by the service account user. How do I achieve that?

    Most likely I will call the migration pipeline via a triggered task from Ansible/Jenkins pipelines.

    How are you all migrating accounts from Git? The account config will differ between dev/QA/prod.

    @dshen - Can you help answer these questions?
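A minimal sketch of the Jenkins/Ansible side, assuming the migration pipeline is exposed as a triggered task and invoked with the service account's bearer token (so the created assets are owned by that account). The pod hostname, org/project/task names, and parameter names below are placeholders, not real values.

```python
import urllib.request
from urllib.parse import urlencode

def triggered_task_url(pod: str, org: str, project: str,
                       task: str, params: dict) -> str:
    """Build the invocation URL for a SnapLogic triggered task.
    Pipeline parameters (e.g. the target environment, so dev/QA/prod
    account configs can differ) are passed as query-string arguments."""
    base = f"https://{pod}/api/1/rest/slsched/feed/{org}/{project}/{task}"
    return f"{base}?{urlencode(params)}" if params else base

def run_task(url: str, token: str) -> int:
    """POST to the task URL with the service account's bearer token and
    return the HTTP status.  Ownership follows the authenticating user."""
    req = urllib.request.Request(
        url, method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

Called from a Jenkins stage or an Ansible `script` task, this keeps the CI system out of the ownership picture: whatever credential authenticates the request is the user that owns the migrated assets.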

  • Would like to share an update:
    If the target org is set as a Trusted Org (please see steps in our doc),
    the user won’t need to re-enter the Account password in the target org.