Re: Issue reading XML data
Akash_Srivastava: <?xml version="1.0" encoding="utf-8"?> EUR Project Code changed 3.53 USD 242.11 INR 3.03 EUR
If you are putting this in the template, you need to put it in the correct XML format, i.e. wrapped in a single root element. Example:
<?xml version="1.0" encoding="utf-8"?>
<root>
EUR Project Code changed 3.53 USD 242.11 INR 3.03 EUR
</root>

Re: API that catches HTTP Post encoded in CXML
ctlarson: input views only accept JSON-encoded data
If you don't know whether the incoming request will be XML or JSON and you want your API to be able to consume both kinds of requests, here is a sample pipeline that does that. The Binary Router checks whether the pipeline request is XML or JSON and then decides which branch should process it.
XMLToJson_2019_05_16.slp (6.7 KB)

Workday Action Based Write
Submitted by @stodoroska from Interworks
This pipeline determines which Workday objects to populate based on the action flags.
Configuration
Based on the action flags, the pipeline determines which Workday objects to populate. You can choose the flags, or the conditions by which the flag values and the target Workday objects are decided.
Sources: CSV file on the file sharing system
Targets: Workday
Snaps used: Workday Write, PGP Decrypt, File Write, File Read, Router, Mapper, Union
Downloads
Workday Action Based Write.slp (89.4 KB)

Salesforce to Azure Data Lake Storage
Submitted by @stodoroska from Interworks
This pipeline selects opportunities from Salesforce, matches them according to their types, and stores them as a CSV file in Azure Data Lake Storage.
Configuration
In Salesforce, you should configure an account. In the File Writer, you need to define the Azure Data Lake Storage location and the Azure account.
Sources: Salesforce Opportunity
Targets: File (Azure Data Lake Storage)
Snaps used: Salesforce Read, Join, Filter, CSV Formatter, File Writer
Downloads
SFDC Joined Data to ADLS.slp (9.9 KB)

Re: Using Github as a code repository for SnapLogic artifacts
Has anyone tried to migrate a project from a Git tag to a different organization than the one where the pipeline is running, and migrate the files to the upper organization (for example, prod)? I succeeded in migrating everything except files: if the files are stored on Git and the migration pipeline from the DEV org to the PROD org runs on DEV, the pipeline does not migrate (create) the files on the upper org.

MQTT to Azure Data Lake
Submitted by @stodoroska from Interworks
Listens for messages on an MQTT queue, converts them into JSON messages, and writes them to Azure Data Lake.
Configuration
You need to configure the respective MQTT queue and account (with the server), as well as the Azure account and the ADLS location.
Sources: MQTT queue
Targets: Azure Data Lake
Snaps used: MQTT Consumer, JSON Parser, JSON Formatter, File Writer
Downloads
MQTT Consumer to Azure.slp (5.8 KB)

File Transfer from SFTP to SMB server or any other two file sharing systems
Submitted by @stodoroska from Interworks
File transfer from an SFTP to an SMB server, or between any other two file sharing systems, with additional validations for file browsing, such as checking that only valid file names, with valid dates, are consumed by the pipeline.
Parent Pipeline
Child Pipeline
Configuration
In the parameters, you need to configure: the file names you want to filter out from the directory, the pattern, the source and target systems, and the accounts. The validation rules (such as filtering only specific dates) are specific to this pipeline and can be changed at any time.
Sources: File
Targets: SMB server or any other file sharing system
Snaps used: Directory Browser, Pipeline Execute, Group By Field, Mapper, JSON Splitter, Router, Filter, File Reader, File Writer, File Operation
Downloads
FileTransfer_SFTP2SMB.slp (18.7 KB)
CheckUniqueFileNames.slp (4.7 KB)

Error Handler for Validation
Submitted by @stodoroska from Interworks
This pattern should be called as an error pipeline; it stores all the error records into a CSV file.
Configuration
A parent pipeline should be created that invokes this one as its error pipeline. The target location is a pipeline parameter. No changes to the code are needed.
Sources: Pipeline that will call this pipeline and send JSON-formatted data
Targets: CSV file on the file sharing system
Snaps used: Mapper, CSV Formatter, File Writer
Downloads
Error Handler for Validation.slp (7.2 KB)

Re: Archiving Files
Yes, but it is not working with the SMB server.

Re: SOAP Customize Envelope
You can use Apache Velocity to do foreach, if, and so on. Example:
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:ns1="urn:general_2015_2.transactions.webservices.netsuite.com"
    xmlns:ns2="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:ns0="urn:messages_2015_2.platform.webservices.netsuite.com"
    xmlns:ns3="urn:core_2015_2.platform.webservices.netsuite.com"
    xmlns:ns4="urn:core_2015_2.platform.webservices.netsuite.com">
  <SOAP-ENV:Header>
    <ns0:applicationInfo>
      <ns0:applicationId>$Soap_Execute_applicationId</ns0:applicationId>
    </ns0:applicationInfo>
    <ns0:passport>
      <ns4:email>$Soap_Execute_Passport_email</ns4:email>
      <ns4:password>$Soap_Execute_Passport_password</ns4:password>
      <ns4:account>$Soap_Execute_Passport_account</ns4:account>
    </ns0:passport>
  </SOAP-ENV:Header>
  <SOAP-ENV:Body>
    <ns0:add>
      <ns0:record ns2:type="$record.journalEntry">
        <ns1:customForm internalId="$record.customForm.internalId" />
        <ns1:postingPeriod internalId="$record.postingPeriod.internalId"/>
        <ns1:tranDate>$record.tranDate</ns1:tranDate>
        <ns1:tranId>$record.tranId</ns1:tranId>
        <ns1:createdFrom internalId="$record.createdFrom.internalId"/>
        <ns1:subsidiary internalId="$record.subsidiary.internalId"/>
        <ns1:reversalDate>$record.reversalDate</ns1:reversalDate>
        <ns1:reversalDefer>$record.reversalDefer</ns1:reversalDefer>
        <ns1:approved>$record.approved</ns1:approved>
        <ns1:memo>$record.memo</ns1:memo>
        <ns1:customFieldList>
          <ns3:customField ns3:scriptId="$record.batchId.scriptId" ns3:internalId="$record.batchId.intId" ns2:type="$record.batchId.type">
            <ns3:value>$record.custbody_batch_id</ns3:value>
          </ns3:customField>
          <ns3:customField ns3:scriptId="$record.customFieldList.customField[0].scriptId" ns3:internalId="$record.customFieldList.customField[0].internalId" ns2:type="ns3:SelectCustomFieldRef">
            <ns3:value>$record.customFieldList.customField[0].value.internalId</ns3:value>
          </ns3:customField>
        </ns1:customFieldList>
        <ns1:lineList>
          #set ($a = -1)
          #foreach($line in $record.lineList.line)
          <ns1:line>
            #set ($a = $a + 1)
            <ns1:account internalId="#foreach($account in $record.lineList.line[$a].account)$account#end"></ns1:account>
            <ns1:debit>$record.lineList.line[$a].debit</ns1:debit>
            <ns1:credit>$record.lineList.line[$a].credit</ns1:credit>
            <ns1:memo>$record.lineList.line[$a].memo</ns1:memo>
            <ns1:custbody_custom_jnl_id>$record.lineList.line[$a].custbody_custom_jnl_id</ns1:custbody_custom_jnl_id>
            <ns1:department externalId="#foreach($department in $record.lineList.line[$a].department)$department#end"/>
            <ns1:class internalId="#foreach($class in $record.lineList.line[$a].class)$class#end"/>
            <ns1:customFieldList>
              <ns3:customField ns3:scriptId="$record.product.scriptId" ns3:internalId="$record.product.intId" ns2:type="$record.product.type">
                <ns3:value internalId="#foreach($value in $record.lineList.line[$a].customFieldList.customField[0].value)$value#end"></ns3:value>
              </ns3:customField>
            </ns1:customFieldList>
          </ns1:line>#end
        </ns1:lineList>
      </ns0:record>
    </ns0:add>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>
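The `#foreach` block in the envelope walks `$record.lineList.line` and emits one `<ns1:line>` element per journal line. As a rough illustration of what that loop expands to, here is a minimal Python sketch of the same per-line expansion; the `record` dict and its values are hypothetical, and only the account/debit/credit/memo fields are shown:

```python
def render_line_list(record):
    # Emulates the Velocity #foreach over $record.lineList.line:
    # one <ns1:line> element is emitted per journal line.
    parts = []
    for line in record["lineList"]["line"]:
        parts.append(
            "<ns1:line>"
            f"<ns1:account internalId=\"{line['account']}\"></ns1:account>"
            f"<ns1:debit>{line['debit']}</ns1:debit>"
            f"<ns1:credit>{line['credit']}</ns1:credit>"
            f"<ns1:memo>{line['memo']}</ns1:memo>"
            "</ns1:line>"
        )
    return "".join(parts)

# Hypothetical two-line journal entry (one debit leg, one credit leg).
record = {"lineList": {"line": [
    {"account": "101", "debit": "100.00", "credit": "0.00", "memo": "debit leg"},
    {"account": "201", "debit": "0.00", "credit": "100.00", "memo": "credit leg"},
]}}
xml = render_line_list(record)
```

In the actual Snap, the same iteration is driven by the `#set`/`#foreach` counter `$a`, so each `$record.lineList.line[$a].…` reference resolves against the current line of the input document.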