How to make a Pipeline wait during execution
A pipeline can be made to wait at a point in its flow by using a Python script in a Script Snap. Change the value on line 22 of the attached script, `time.sleep(120)`; here 120 is the number of seconds the pipeline will wait. Note: the value must be provided in seconds.

Python script: Script_to_Wait_pipline.zip (723 Bytes)
Pipeline: Make Pipeline to wait_2017_03_06.slp (4.0 KB)

Elastic Mapping with a simple Script Snap
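As a minimal standalone sketch of the wait technique (plain Python outside the Script Snap wrapper; the delay is shortened here purely for illustration, where the attached script uses 120 seconds):

```python
import time

WAIT_SECONDS = 0.2  # the attached script uses 120; shortened for illustration

start = time.monotonic()
time.sleep(WAIT_SECONDS)  # execution pauses here, just as the pipeline would
elapsed = time.monotonic() - start
print("waited %.1f seconds" % elapsed)
```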
An elastic Mapper which outputs JSON that contains only the keys with non-null values.

Background: The use case came up while mapping fields from a Salesforce Outbound Message. This message sends a varying number of fields from each record, each with a non-null value. For example, the SFDC object had 50 fields in total; one record would return a document with 35 of the 50 possible fields populated, the next record would send 25, and so on. The data is then moved to another database that is pre-populated with data for each record. The problem was that with a Mapper, all 50 fields would need to be mapped in advance, so the fields with null values could overwrite data when updating the database. Our Mapper is static: target schemas need to be defined up front. The problem to solve was how to filter the null fields out of each input document so that the update can occur without overwriting data with null values.

The Magic: All you need is ONE line of code added to the 'try' block of the Script Snap (JavaScript in this case) that removes from the input map any entries with null values:

```javascript
// …
this.log.info("Executing Transform Script");
while (this.input.hasNext()) {
    try {
        // Read the next document, wrap it in a map and write out the wrapper
        var doc = this.input.next();
        while (doc.values().remove(null));  // <-- the one line: strip all null entries
        var wrapper = new java.util.HashMap();
        wrapper.put("original", doc);
        this.output.write(doc, doc);  // <-- Optional note: I modified this to flatten the output
        this.log.info("Transform Script finished");
    }
    // …
```

In Python, the same construct takes two lines (please note the indentation required in Python):

```python
while (in_doc.values().remove(None)):
    None
```

See the attached test case pipeline.

Limitations: This was created to work only with flattened, non-hierarchical structures.

Elastic Mapping_2017_03_01.slp (9.6 KB)

Flatten JSON files into CSV files
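For comparison, the same null-stripping idea on an ordinary Python dict (a sketch with made-up field names, not the Java map construct used inside the Snap):

```python
def strip_nulls(doc):
    """Return a copy of doc containing only the keys with non-null values."""
    return {key: value for key, value in doc.items() if value is not None}

# A record where only some of the possible fields are populated
record = {"Id": "001", "Name": "Acme", "Phone": None, "City": "Berlin", "Fax": None}
print(strip_nulls(record))
```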
Created by @schelluri

This pipeline pattern flattens a JSON file which has multiple objects and turns it into a CSV file.

Configuration
Sources: JSON Generator
Targets: CSV file
Snaps used: JSON Generator, JSON Formatter, JSON Parser, Script, CSV Formatter, File Writer

Downloads
MS_Flatten_Script.slp (31.4 KB)

Remote file renaming/moving
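The attached pipeline does its flattening inside a Script Snap. As a rough sketch of the technique (not the attached script itself, and with made-up sample data): recursively flatten nested objects into dotted keys, then write the flat record as CSV:

```python
import csv
import io

def flatten(obj, prefix=""):
    """Flatten nested dicts into a single-level dict with dotted keys."""
    flat = {}
    for key, value in obj.items():
        full_key = prefix + key
        if isinstance(value, dict):
            flat.update(flatten(value, full_key + "."))
        else:
            flat[full_key] = value
    return flat

doc = {"id": 1, "name": {"first": "Ada", "last": "Lovelace"}}
row = flatten(doc)

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(row.keys()))
writer.writeheader()
writer.writerow(row)
print(buf.getvalue())
```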
Hi, I have a requirement to push a file to an SFTP server. Since the file is quite large, and to avoid a race condition (we are writing while a remote process is reading at the same time), we would like to write the file with a different extension, or to a temp folder on the remote server, and then rename or move it on the server. None of the File Snaps support this feature. Could you please help me identify a way to address the above problem?

Call a command line utility on the Snaplex node
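One common workaround is exactly the pattern described: write under a temporary name and rename once the write completes, so readers never see a partial file. A sketch of that write-then-rename pattern using the local filesystem (on an SFTP server the same sequence applies via a rename after the upload; all names below are made up):

```python
import os
import tempfile

def safe_write(path, data):
    """Write data under a temporary name, then rename into place once complete."""
    tmp_path = path + ".part"      # readers are told to ignore the .part extension
    with open(tmp_path, "w") as f:
        f.write(data)
    os.replace(tmp_path, path)     # rename is atomic on the same filesystem

workdir = tempfile.mkdtemp()
target = os.path.join(workdir, "report.csv")
safe_write(target, "a,b\n1,2\n")
content = open(target).read()
print(content)
```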
Created by @pkona

This pipeline pattern uses the Script Snap (here labeled Execute Script Snap) to call a command line utility on the Snaplex node. The pipeline can call any allowed shell command on the node, and the command is executed as the user that is running the JCC process. In this sample, the command is configured in the Mapper Snap.

Configuration
Sources: Python Script
Targets: JSON File
Snaps used: Mapper, Script, JSON Formatter, File Writer

Downloads
Pattern to call-shell-command using Script Snap on Snaplex node.slp (6.8 KB)

Script snap with ultra pipelines: original property mapping to doc which is modified
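A minimal plain-Python sketch of the underlying call (the attached pipeline wires this through a Script Snap, with the command supplied by the Mapper; the echo command here is only an example):

```python
import subprocess

def run_command(cmd):
    """Run a shell command and return its exit code and captured stdout."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return result.returncode, result.stdout.strip()

code, out = run_command("echo hello from the Snaplex node")
print(code, out)
```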
Hi, I have a question about this statement from the SnapLogic documentation on Ultra Pipeline tasks (https://docs-snaplogic.atlassian.net/wiki/display/SD/Ultra+Pipeline+Tasks):

"Script and Execute Script Snaps need you to pass the original document to the 'write()' method for the output view."

I am using a Script Snap in a pipeline to move some properties around (thereby changing the original document). I am wrapping my original document in a wrapper and putting it in the 'original' property, as demonstrated in the sample Script Snap; however, 'original' contains my modified doc. Will this be a problem if I try to use Ultra? A snippet from my Script Snap is included below:

```javascript
execute: function () {
    this.log.info("Executing Transform Script");
    while (this.input.hasNext()) {
        try {
            // Read the next document, wrap it in a map and write out the wrapper
            var doc = this.input.next();
            var wrapper = new java.util.HashMap();
            wrapper.put("original", doc.clone());
            wrapper.put("integrationMessage", doc["integrationMessage"]);

            var studentGradesWrapper = doc["integrationMessage"]["body"]["gradingSheetInfo"]["gradingSheet"]["studentGrades"];
            var supplementalInfosWrapper = doc["integrationMessage"]["body"]["gradingSheetInfo"]["supplementalInfos"];
            ensureArray(studentGradesWrapper, "studentGrade");
            ensureArray(supplementalInfosWrapper, "supplementalInfo");

            var studentIdsToCslwIdsMap = new java.util.HashMap();
            supplementalInfosWrapper["supplementalInfo"].forEach(function (supplementalInfo) {
                studentIdsToCslwIdsMap.put(supplementalInfo["studentId"],
                                           supplementalInfo["coursesectionLW"]["objectId"]);
            });
            studentGradesWrapper["studentGrade"].forEach(function (studentGrade) {
                studentGrade.put("snapLogicCourseSectionLWId",
                                 studentIdsToCslwIdsMap.get(studentGrade["student"]["objectId"]));
            });

            this.output.write(doc, wrapper);
        } catch (err) {
            var wrapper = new java.util.HashMap();
            wrapper.put("errorMsg", err);
            this.log.error(err);
            this.error.write(wrapper);
        }
    }
}
```
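For readers following the logic, here is the transform from the snippet restated as a plain-Python sketch. `ensureArray` is the poster's own helper (not shown) and is approximated below; the deep copy is one way to keep 'original' truly unmodified. The sample document is made up, and this is an illustration, not the poster's actual script:

```python
import copy

def ensure_array(wrapper, key):
    """Approximation of the poster's ensureArray: coerce a lone object into a one-element list."""
    if not isinstance(wrapper[key], list):
        wrapper[key] = [wrapper[key]]

def transform(doc):
    original = copy.deepcopy(doc)  # deep copy, so later in-place edits do not leak into "original"
    info = doc["integrationMessage"]["body"]["gradingSheetInfo"]
    grades = info["gradingSheet"]["studentGrades"]
    supplemental = info["supplementalInfos"]
    ensure_array(grades, "studentGrade")
    ensure_array(supplemental, "supplementalInfo")

    # Map student ids to course-section ids, then annotate each grade record
    id_map = {s["studentId"]: s["coursesectionLW"]["objectId"]
              for s in supplemental["supplementalInfo"]}
    for grade in grades["studentGrade"]:
        grade["snapLogicCourseSectionLWId"] = id_map.get(grade["student"]["objectId"])

    return {"original": original, "integrationMessage": doc["integrationMessage"]}

# Made-up sample document shaped the way the snippet expects
doc = {"integrationMessage": {"body": {"gradingSheetInfo": {
    "gradingSheet": {"studentGrades": {"studentGrade": {"student": {"objectId": "S1"}}}},
    "supplementalInfos": {"supplementalInfo": {"studentId": "S1",
                                               "coursesectionLW": {"objectId": "CS9"}}}}}}}
result = transform(doc)
print(result["integrationMessage"]["body"]["gradingSheetInfo"]["gradingSheet"]
      ["studentGrades"]["studentGrade"][0]["snapLogicCourseSectionLWId"])
```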