Need Help to convert from XML to CSV

Hello Team, I need to convert an XML file to CSV, i.e. map and flatten it. I am not able to verify the XML docs in order to proceed further; I have attached the XML file. Can anyone please help me here and provide a sample pipeline for the same? AllScripts.txt (131.5 KB)

Re: Using Github as a code repository for SnapLogic artifacts

Hello @dmiller, I found the solution myself and am able to integrate both GitLab and GitHub from SnapLogic. Thanks for your info.

Re: Using Github as a code repository for SnapLogic artifacts

Hello @bhavin.patel, I tried using the attached .slp files but I keep getting the error below:

"REST API service endpoint returned error result: status code = 404, reason phrase = Not Found, refer to the error_entity field in the error view document for more details"
"error_entity": "{"message":"Not Found","documentation_url":"Repositories - GitHub Docs"}"
{ "error": "REST API service endpoint returned error result: status code = 404, reason phrase = Not Found" }

Steps I followed:
Created an account in GitHub; let's say "subhashchandra" is my user.
Created a repository under that; let's say "SL_poc" is my repository.
Under SL_POC I made the same folder structure as in my SnapLogic org, e.g. SnapLogicOrgName/Projectspacename/Foldername.
Created a REST Basic Auth account in SnapLogic with my GitHub credentials.
In the SnapLogic REST Get snap, selected that account and used a URL like https://api.github.com/repos/subhashchandra/SL_POC/contents/SnapLogicOrgName/Projectspacename/Foldername/pipeline name
Added User-Agent as a header and provided my GitHub username.

I also tried from the Postman app but had no luck. Appreciate your help here. Also, how different is it if I want to do this with GitLab instead of GitHub? If you have any sample code for GitLab, that would be very helpful.

Re: Pipeline Details with dependency from Dashboard

Thanks @bojanvelevski for your help.

Re: Script help

Thanks Robin for your suggestion. I tried that and was able to access the files, but I still want to access them from a script; if you have something for that, it would be good.

Pipeline Details with dependency from Dashboard

Hello Everyone, I have a requirement where approximately 500 pipelines are already running in the production environment, and every pipeline calls multiple child pipelines to achieve its functionality. I have to create an Excel spreadsheet listing each main pipeline and all of its child pipelines in order of execution.

Suppose that in Project Space 1, 'A' is the main pipeline and calls child pipeline 'B', 'B' calls child pipelines 'C' and 'D', and 'C' calls pipeline 'E'; and in Project Space 2, main pipeline 'X' calls child pipeline 'Y'. Then the output should be like below:

|ProjectSpace|Main Pipeline|Child Pipeline|
|Project 1|A|B|
|Project 1|B|C & D|
|Project 1|C|E|
|Project 2|X|Y|
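
One offline way to build that spreadsheet: export the project spaces (exported pipelines are .slp files, which are JSON) and scan each file for child-pipeline references, then write the parent/child rows to CSV. A rough sketch, assuming Python; the property name that holds the child pipeline reference inside the exported JSON ("pipeline" below) is an assumption, so open one exported .slp and confirm the real key before trusting the output. The project space is taken from the export folder name, which also assumes the export layout mirrors the org.

import csv
import json
from pathlib import Path

CHILD_KEYS = {"pipeline"}  # assumed key name(s) holding the child pipeline reference

def find_children(node, found):
    """Recursively collect string values stored under the assumed child-pipeline keys."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key in CHILD_KEYS and isinstance(value, str) and value:
                found.add(value)
            find_children(value, found)
    elif isinstance(node, list):
        for item in node:
            find_children(item, found)

def build_report(export_root, out_csv):
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ProjectSpace", "Main Pipeline", "Child Pipeline"])
        for slp in Path(export_root).rglob("*.slp"):
            children = set()
            find_children(json.loads(slp.read_text()), children)
            for child in sorted(children):
                # parent folder name stands in for the project space in this sketch
                writer.writerow([slp.parent.name, slp.stem, child])

# build_report("exports/", "pipeline_dependencies.csv")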
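
Going back to the GitHub 404 reply above: the Contents API endpoint (GET /repos/{owner}/{repo}/contents/{path}) is the right one, so a 404 usually means a path segment doesn't match what is actually committed (paths inside a repository are case-sensitive, so SL_poc vs SL_POC is worth double-checking), the file was never pushed, or the credentials can't see a private repository (GitHub answers 404 rather than 401 in that case). A small sketch, assuming Python with the requests library, that drills down the path one level at a time so the failing segment shows up; the owner, repo, and token values are placeholders.

import requests

GITHUB_API = "https://api.github.com"
OWNER, REPO = "subhashchandra", "SL_poc"   # values from the post
TOKEN = "<personal access token>"          # placeholder

def list_path(path=""):
    """List one level of the repository so the failing segment of the path shows up."""
    url = f"{GITHUB_API}/repos/{OWNER}/{REPO}/contents/{path}"
    resp = requests.get(
        url,
        headers={"Accept": "application/vnd.github+json", "User-Agent": OWNER},
        auth=(OWNER, TOKEN),
    )
    print(resp.status_code, url)
    if resp.ok:
        items = resp.json()
        for item in (items if isinstance(items, list) else [items]):
            print("  ", item["type"], item["path"])
    return resp

# Drill down one segment at a time until the 404 appears:
# list_path()                    # repository root
# list_path("SnapLogicOrgName")  # then each deeper folder, matching case exactly

# GitLab equivalent (v4 API), for the same kind of check:
#   GET https://gitlab.com/api/v4/projects/<project id>/repository/files/<URL-encoded path>?ref=<branch>
#   with header  PRIVATE-TOKEN: <personal access token>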
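
And for the XML-to-CSV question at the top: inside SnapLogic the usual route is an XML Parser snap followed by mapping and a CSV Formatter, but since the structure of AllScripts.txt isn't visible here, a minimal Python sketch of the generic flatten-and-write logic. The record_tag value (the element that repeats once per output row, "Patient" below) is a hypothetical placeholder to replace after inspecting the file.

import csv
import xml.etree.ElementTree as ET

def flatten(elem, prefix="", row=None):
    """Recursively flatten an element's attributes and leaf text into one flat dict."""
    if row is None:
        row = {}
    for name, value in elem.attrib.items():
        row[f"{prefix}{elem.tag}@{name}"] = value
    children = list(elem)
    if not children:
        row[f"{prefix}{elem.tag}"] = (elem.text or "").strip()
    else:
        # note: repeated child tags overwrite each other; real data may need index suffixes
        for child in children:
            flatten(child, prefix=f"{prefix}{elem.tag}/", row=row)
    return row

def xml_to_csv(xml_path, csv_path, record_tag):
    tree = ET.parse(xml_path)
    rows = [flatten(rec) for rec in tree.getroot().iter(record_tag)]
    fieldnames = sorted({key for row in rows for key in row})
    with open(csv_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames, restval="")
        writer.writeheader()
        writer.writerows(rows)

# Example call with a hypothetical record tag:
# xml_to_csv("AllScripts.txt", "allscripts.csv", record_tag="Patient")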
Script help

Hello All, there is a log file written during execution of a SnapLogic pipeline. As we don't have direct access, I wanted to browse the directory below from a Script snap. Can anyone help me here? Below is the JavaScript used to write the logs:

try {
    load("nashorn:mozilla_compat.js");
} catch (e) {
}

importPackage(com.snaplogic.scripting.language);
importPackage(com.fasterxml.jackson.databind);
importClass(java.util.HashMap);
importClass(java.io.File);

var impl = {
    input: input,
    output: output,
    error: error,
    log: log,
    execute: function () {
        while (this.input.hasNext()) {
            var doc = this.input.next();
            this.output.write(doc, doc.response);
            var lastSystemTime = doc.lastSystemTime;
            // Copy the document into a plain map, dropping the transient timestamp field
            var wrapper = new java.util.HashMap();
            wrapper.putAll(doc);
            wrapper.remove("lastSystemTime");
            // Close out the last externalSystem entry if it has no endTime yet
            var externalSystem = wrapper.externalSystem;
            if ((externalSystem != null) && (externalSystem.length > 0)) {
                if (!externalSystem[externalSystem.length - 1].endTime) {
                    externalSystem[externalSystem.length - 1].endTime = lastSystemTime;
                }
            }
            wrapper.put("success", false);
            try {
                // Write the log as JSON, one file per pipeline execution id
                var om = new ObjectMapper();
                om.writeValue(new java.io.File("/tmp/API_LOG/" + doc.executionId), wrapper);
            } catch (e) {
                // swallow write failures so the pipeline keeps running
            }
        }
    }
};

var hook = new com.snaplogic.scripting.language.ScriptHook(impl);

Can anyone help me read the above directory or files from a script? Thanks in advance.

Timestamp format while reading Parquet file

Hello Team, we have a requirement to read files from an S3 bucket, where the source files are Parquet. Since Parquet stores data in a binary format, when I convert those files with an online Parquet reader I get these field values:

create_dt : 8/21/2019 7:59:23 PM +00:00
updated_dt : 8/21/2019 7:59:24 PM +00:00

But when we read the same files with the SnapLogic Parquet Reader snap, it returns:

$create_dt : '-7383172727504207099366988544'
$update_dt : '12649597609238856698939319552'

We need to load this data into Redshift with a timestamp datatype. Thanks for the help in advance.

File transfer on sftp server through REST POST snap

Hi Team, we have to transfer a flat CSV file from one SFTP server to another, but we don't have access to the target SFTP server through WinSCP or FileZilla. The only option we have is to transfer the file through a REST Post snap: the target team will publish their endpoint URL along with a Bearer token, and we have to call their API through the REST Post snap with our source CSV flat file. Can anyone comment on this?

I am able to place the file, but it arrives like this in the target file:

--exdaKoSupEie4I5GhXYNQ3_jw4olL1sVwO
Content-Disposition: form-data; name="file"; filename="Test_File_Subhash_7.csv"
Content-Type: text/csv; charset=UTF-8
Content-Transfer-Encoding: binary
Test FIle 1 to test
Test FIle 2 to test
--exdaKoSupEie4I5GhXYNQ3_jw4olL1sVwO--

But our original content is:

Test FIle 1 to test
Test FIle 2 to test

Connection with PECI connector

Hi Team, we have a requirement to extract a file using the Workday PECI connector and, after transformation, deliver it as a CSV flat file to a third-party SFTP server. Is there any way to connect?
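
On the Script help post above: the files the script writes are plain JSON (Jackson's ObjectMapper.writeValue on a Map), and they live on the Snaplex node that ran the pipeline, so whatever reads them has to run on that same node. A minimal reading sketch in Python; if your Script snap accepts Python, the same loop can feed the snap's output view, otherwise run it as a standalone script on the node. The /tmp/API_LOG path is the one hard-coded in the logging script.

import json
import os

LOG_DIR = "/tmp/API_LOG"   # path hard-coded in the logging script above

def read_api_logs(log_dir=LOG_DIR):
    """Yield (execution_id, parsed_log) for every log file in the directory."""
    for name in sorted(os.listdir(log_dir)):
        path = os.path.join(log_dir, name)
        if not os.path.isfile(path):
            continue
        with open(path) as fh:
            # each file is named after the pipeline execution id and holds one JSON object
            yield name, json.load(fh)

# for execution_id, log in read_api_logs():
#     print(execution_id, log.get("success"))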
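
On the Parquet timestamp post: those raw integers look like Parquet INT96 timestamps surfaced without being decoded. An INT96 timestamp is 12 bytes: an 8-byte little-endian count of nanoseconds since midnight followed by a 4-byte little-endian Julian day number. The Python sketch below decodes that layout; how the reader packed the 12 bytes into the single integer it emitted is an assumption, so both byte orders are tried. Reading the same file with pyarrow or pandas, which convert INT96 to timestamps automatically, is a quick way to confirm the expected values before loading into Redshift.

import struct
from datetime import datetime, timedelta, timezone

JULIAN_UNIX_EPOCH = 2440588  # Julian day number of 1970-01-01

def int96_to_datetime(raw12):
    """Decode 12 raw INT96 bytes: 8-byte LE nanoseconds-in-day, then 4-byte LE Julian day."""
    nanos, julian_day = struct.unpack("<qi", raw12)
    day = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(days=julian_day - JULIAN_UNIX_EPOCH)
    return day + timedelta(microseconds=nanos / 1000.0)

def decode_reader_value(value):
    """Try both byte orders, since how the snap serialised the 12 bytes is unknown here."""
    results = {}
    for order in ("little", "big"):
        try:
            results[order] = int96_to_datetime(value.to_bytes(12, byteorder=order, signed=True))
        except (OverflowError, ValueError):
            results[order] = None  # that packing produces an impossible date
    return results

# decode_reader_value(-7383172727504207099366988544)
# If one of the two entries lands near 2019-08-21 19:59:23+00:00, that confirms the packing;
# otherwise inspect the raw Parquet bytes directly (e.g. with pyarrow).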
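
On the REST POST file transfer post: the boundary line and the Content-Disposition/Content-Type headers inside the landed file mean the receiving endpoint writes the whole request body to disk verbatim, i.e. it expects the bare CSV bytes rather than a multipart/form-data upload. A sketch of both request shapes using Python's requests library so the difference is visible; the URL, header names, and token are placeholders for whatever the target team publishes.

import requests

url = "https://target.example.com/upload/Test_File_Subhash_7.csv"   # placeholder endpoint
headers = {"Authorization": "Bearer <token>", "Content-Type": "text/csv"}

# Raw body: the file content arrives exactly as-is (no boundary, no part headers).
with open("Test_File_Subhash_7.csv", "rb") as f:
    resp = requests.post(url, headers=headers, data=f)
print(resp.status_code)

# For comparison, this is what produces the boundary/Content-Disposition wrapper seen in the
# target file; only use it if the receiving API actually parses multipart/form-data:
# with open("Test_File_Subhash_7.csv", "rb") as f:
#     requests.post(url, headers={"Authorization": "Bearer <token>"}, files={"file": f})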