Re: How to receive file through SnapLogic API call

Hi Rudhiran -- the double quotes were escaped in my example; not sure where those came from. Could you try this expression, with the backslashes removed, in your Mapper and see if it works for you?

$['content-disposition'].match('filename="(.*)"')[1]

Re: How to create timestamps and output them only using Script snap for Python?

Hi Palermo,

Here are some examples of how timestamps can be used within a Script snap. See the attached pipeline if you want to try it out. This segment of the pipeline shows one way to create a timestamp from within the Python Script snap using the Java Instant class, passing the value to a downstream Mapper that parses it back into a compatible Date type.

# Import the interface required by the Script snap.
from com.snaplogic.scripting.language import ScriptHook
from java.time import Instant

class TransformScript(ScriptHook):
    def __init__(self, input, output, error, log):
        self.input = input
        self.output = output
        self.error = error
        self.log = log

    # The "execute()" method is called once when the pipeline is started
    # and allowed to process its inputs or just send data to its outputs.
    def execute(self):
        self.log.info("Executing Transform script")
        while self.input.hasNext():
            try:
                # Read the next input document, store it in a new
                # dictionary, and write this as an output document.
                inDoc = self.input.next()
                outDoc = {
                    'original': inDoc,
                    'ts': Instant.now().toEpochMilli()
                }
                self.output.write(inDoc, outDoc)
            except Exception as e:
                errDoc = {'error': str(e)}
                self.log.error("Error in python script")
                self.error.write(errDoc)
        self.log.info("Script executed")

    # The "cleanup()" method is called after the snap has exited the execute() method.
    def cleanup(self):
        self.log.info("Cleaning up")

# The Script snap will look for a ScriptHook object in the "hook"
# variable. The snap will then call the hook's "execute" method.
hook = TransformScript(input, output, error, log)

I included another segment in the pipeline that shows the opposite, i.e. how to create a timestamp in the Mapper and then use it within a Script snap. I don't think that's exactly what you were after, but hopefully it may help others in their work. Hope this helps you.

Re: Do JSON mapping in a single mapper

Here are two ways you could accomplish this. The first is to copy the data into two streams, split each, and then union them back together; see the attached json-splitter-example for reference. The second is to use a Mapper to concat the two arrays together and then use a JSON Splitter afterwards to split on the new array, so you have a document for each element of the new array; see single-mapper-example for reference. Hope this is helpful.

Re: How to know all the column names (input columns) using an expression in a Mapper

How are you using those columns downstream from the Mapper? One performance improvement would be to make the call to $.keys() only once and reuse that information. However, if you need to join it with each document, then I'm not sure that would improve anything. If you only use it in some kind of load/write operation, then maybe that would be an improvement for you.

Re: How to know all the column names (input columns) using an expression in a Mapper

Can you talk more broadly about the problem you're trying to solve? If you went with this approach, then you could avoid doing it for each row by routing a single document into the Mapper that finds the document's properties/column names.

Re: Formatting a JSON object

If you want something a tad more generic and you know that you'll always have a "Key 1" and "Key 2" field, then you could use an expression like this:

{[$["Key 1"]]: $["Key 2"]}

Re: Flatten multilevel/nested JSON Objects and Arrays

There is the AutoPrep snap that can handle this transformation. Do you have access to that?
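If AutoPrep isn't an option, the flattening itself can be sketched in plain Python. This is a hypothetical helper, not a SnapLogic API; inside a Script snap you would adapt it to the document interface shown in the timestamp example above.

```python
def flatten(value, prefix="", sep="_", out=None):
    # Recursively flatten nested dicts and lists into a single-level
    # dict, joining path segments with the separator.
    if out is None:
        out = {}
    if isinstance(value, dict):
        for k, v in value.items():
            flatten(v, prefix + sep + k if prefix else k, sep, out)
    elif isinstance(value, list):
        for i, v in enumerate(value):
            key = "%s%s%d" % (prefix, sep, i) if prefix else str(i)
            flatten(v, key, sep, out)
    else:
        out[prefix] = value
    return out

doc = {"a": {"b": 1, "c": [2, 3]}}
print(flatten(doc))  # {'a_b': 1, 'a_c_0': 2, 'a_c_1': 3}
```

The separator and the index-suffix convention for arrays are choices you'd want to match to whatever your downstream snaps expect.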
https://docs-snaplogic.atlassian.net/wiki/spaces/SD/pages/2568781882/AutoPrep

Re: Sort not working.....

The Sort snap operates on a document-by-document basis, but the input to the snap contains only one document. In order to sort on the field you'd like, you'll need to first split the document into multiple documents, sort those, and then group them back together. Here's an example that demonstrates how you could do that sort of thing.

Re: Logging mechanism in order to get parent pipeline name in logging pipeline

I'd probably take the approach where the "root" pipeline (parent_pipeline_A) passes its label into the child, and the child then passes that into the pipelineWithLogging. The Pipeline Execute snap can set a child pipeline's parameters, and that would be a way to achieve this. There are other options, but this seems like a pretty straightforward approach.

Re: What is the correct syntax for regular expressions in SnapLogic

Take a look at the Java regex documentation, as that is what is used. If you need more help, then an example input and output along with the regex would make it easier for me to grok what you're doing.
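Capture-group semantics are close enough between Java and Python that the filename expression from the first answer can be sketched in Python to show what the [1] index is doing. The sample header value here is just an illustration, not taken from any real request.

```python
import re

# A sample content-disposition header value (illustrative only).
header = 'attachment; filename="report.csv"'

m = re.search(r'filename="(.*)"', header)
# group(0) is the whole match; group(1) is the text captured by the
# first pair of parentheses -- analogous to indexing the match()
# result with [1] in the expression language.
print(m.group(1))  # report.csv
```

The same pattern string works with java.util.regex.Pattern, which is what SnapLogic expressions use under the hood.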