Dynamic Data Pivot

Created by @dwhite

An example of how to perform a dynamic pivot on data that needs to be pivoted. The traditional Pivot Snap is static and has to be configured per set of data (e.g., per pipeline). This pipeline shows how to perform a pivot operation on data with variable fields that can be supplied at runtime, so the pivot configuration can be done on the fly during the run.

Configuration
Configure the dynamic pivot via parameter values. Enter the number of fields to split in the "nSplitFields" parameter. Enter the field names being split in the "splitFields" parameter as a comma-separated list. Enter the new fields to generate in the "genFields" parameter as a comma-separated list. For actual use, remove the sample data and the traditional Pivot Snap; they are included only for demonstration and comparison.

Sources: any flat data source that needs pivoting
Targets: any
Snaps used: CSV Generator, Copy, Pivot, Mapper, Sequence, Join, JSON Splitter, Group By N
Downloads: Dynamic Data Pivot.slp (15.7 KB)

Data Orchestration Process with Asynchronous Response

Created by @dwhite

A sample pattern for triggering a large data orchestration process with several steps (synchronous and parallel) that returns an asynchronous response at the start, so the calling process will not wait. The response contains a SnapLogic ruuid that can be used to call back later to check on status or retrieve the log.

Configuration
Insert your child data-warehousing process pipelines into the Pipeline Execute Snap(s) as needed. Assign the pipeline to a trigger task. Call it from an external application or process.

Sources: any child pipeline process performing data loads/gets
Targets: any
Snaps used: Pipeline Execute, Mapper, Copy
Downloads: Asynchronous Orchestration Process.slp (16.8 KB)

Active Directory Get User Data Endpoint

Created by @dwhite

Enter a base DN and an AD username to use LDAP search to find a specific user. Run as a trigger task endpoint or as a child pipeline.
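To illustrate the row-splitting logic behind the Dynamic Data Pivot pipeline described earlier, here is a minimal plain-Python sketch (outside SnapLogic). The parameter names mirror the pipeline parameters; the interpretation of "genFields" as a key/value pair of output column names is an assumption for illustration, not taken from the pipeline itself:

```python
# Hypothetical stand-in for the dynamic pivot: each input row is split
# on a runtime-supplied list of fields (splitFields), and each split
# value is emitted under runtime-supplied output field names (genFields),
# so no per-dataset Pivot Snap configuration is needed.

def dynamic_pivot(rows, split_fields, gen_fields):
    """Unpivot split_fields out of each row into gen_fields columns.

    gen_fields is assumed to hold two names: one for the original
    field name and one for its value (mirroring a key/value pivot).
    """
    key_field, value_field = gen_fields
    out = []
    for row in rows:
        # Carry the non-split fields through unchanged.
        base = {k: v for k, v in row.items() if k not in split_fields}
        for f in split_fields:
            pivoted = dict(base)
            pivoted[key_field] = f
            pivoted[value_field] = row.get(f)
            out.append(pivoted)
    return out

rows = [{"id": 1, "q1": 10, "q2": 20}]
result = dynamic_pivot(rows, split_fields=["q1", "q2"],
                       gen_fields=["field", "value"])
# result -> [{"id": 1, "field": "q1", "value": 10},
#            {"id": 1, "field": "q2", "value": 20}]
```

Because the field lists are ordinary arguments, the same function handles any shape of input without reconfiguration, which is the point of the dynamic pattern.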
Configuration
Enter the base DN to look for users in via the "baseDn" parameter. Enter the AD username in the "filterUser" parameter. Assign to a trigger task and call via REST, or assign to a Pipeline Execute Snap and run as a child as part of an orchestrator pipeline.

Sources: Active Directory user object class
Targets: REST call return
Snaps used: LDAP Search, Filter, Mapper, Join, Union, Group By N
Downloads: AD Get User Data Endpoint.slp (14.9 KB)

Re: REST Post Multipart Form-Data + File Upload Issue

If I had to guess, I'd say it's the cookie. Postman is auto-sending this for you: because it is stateful, it probably captured the cookie from the response to your first API call. In SnapLogic, each Snap executes separately; even if you have a chain of REST Snaps, they all function independently, so you don't have that luxury. You'd need to capture the cookie, which you'll likely find in the response header of your first call, and pass it as a Cookie header on calls 2 and 3. Without the cookie on the second request, the server can't tie it to the first REST operation, so it just gives you a blank response back.

Re: REST Post Multipart Form-Data + File Upload Issue

Could you expand the temporary headers section in the Postman request header section? What kind of authorization are you using in both Postman and SnapLogic? Also, have you checked the endpoint to see whether the file from the SnapLogic POST made it there, regardless of the response body it's giving back?

Re: REST Post Multipart Form-Data + File Upload Issue

In your response in SnapLogic, what HTTP headers are you getting back? Also, what HTTP headers are you sending in the Postman request?

Re: REST Post Multipart Form-Data + File Upload Issue

It looks like you're giving the entity JSON; a target server expecting form-data isn't going to know what to do with that. Try formatting your entity like 'key1=' + $yourData1 + '&key2=' + $yourData2, right in the entity setting of the Snap.
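The 'key1=' + $yourData1 + '&key2=' + $yourData2 expression above hand-builds an application/x-www-form-urlencoded body. Outside SnapLogic, a quick Python sketch shows the equivalent encoding, with special characters escaped safely; the key names are placeholders, not anything from the original thread:

```python
from urllib.parse import urlencode

# Build a form-encoded request body from key/value pairs.
# Values containing '&', '=', or spaces are percent-escaped,
# which the manual string concatenation above would not do.
body = urlencode({"key1": "some value", "key2": "a&b=c"})
# body == "key1=some+value&key2=a%26b%3Dc"
```

If the values can contain reserved characters, an encoder like this (or SnapLogic's own `encodeURIComponent`-style expression functions) is safer than plain concatenation.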
Also, it looks like your setting for Single File Upload: Multipart Content-Type is not set correctly. This field is for the MIME type of the file you're trying to post, so that when the Snap formats it into a multipart request, it knows how to describe the file. Using "multipart/form-data", as you have set in the screenshot, is not going to work. It looks like you're posting an .xlsx file, so you can either use the file's specific MIME type, "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", or, preferably, the generic binary-stream content type "application/octet-stream". That way the target server knows what kind of file you're posting. Try making those changes and see what your results are.

Re: How to improve pipeline performance

Replacing your two scripts with Mappers should net you some performance gain; you don't need a script to build a JSON string. The script in the "Split events" pipeline, which is doing

data = self.input.next()
newData = {}
newData['record'] = data
newData['sequence_id'] = data['sequence']['id']
newData['event_id'] = data['event_id']
self.output.write(newData)

can be replaced with a Mapper. Likewise, the script in "Write events",

data = self.input.next()
self.output.write(data['record'])

can be replaced with a Mapper.

Re: Task Changes Slow in Metadata Read Snap

When you observe this behavior, is it just in a preview, or in the result of a run? It could be that you just need to do a "Retry" after your validate to refresh the cache, so it actually reads the new metadata instead of using the cached version. It sounds like it's just reading the cached version, and when the cache expires you see the updated data. This wouldn't affect any actual runs, just the data you see in the preview.

Re: Web Service Security

No.
Any user with a standard account (i.e., one with a username and password) that has been granted at least Read + Execute access to the project where the trigger task resides can run that trigger task using basic authentication. This can be a standard user or a service account. Service accounts are set up like a normal user, and their permissions are managed like a normal user's, but they are unable to log in to the UI. Ultra only works with bearer auth.
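The basic authentication mentioned above just means the caller sends an Authorization header of the form base64("username:password"). A minimal Python sketch of building that header for a trigger-task call; the credentials are placeholders:

```python
import base64

# Build an HTTP Basic Authorization header (RFC 7617):
# "Basic " + base64("username:password").
def basic_auth_header(username, password):
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return {"Authorization": f"Basic {token}"}

headers = basic_auth_header("svc_account", "s3cret")
# headers["Authorization"] is "Basic " followed by the encoded credentials
```

Any HTTP client can then attach this header when invoking the trigger task URL, whether the credentials belong to a standard user or a service account.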