Reverse Data Look Up

We receive an XML document where I unpacked the records and flattened the arrays so we can write to a Microsoft Excel file. Each record should have an identifier, but the XML document only has this identifier at the end of each group. I was able to get the identifier added to the last record of the group. Now I need to add this identifier to each record above it, based on the number of records in that group; the record count for each group also comes from the XML document. In the example below, account ab has identifier 105 and is the only record in its group. Account ff has identifier 106, so that identifier should be applied to accounts ds and ff. Account asdf has identifier 556, so that identifier should be applied to accounts ss through asdf, and so on. Does anybody have suggestions on how I can add the identifiers to previous records based on the number of records in each group?

Snowflake SCD2 - Joda Time Validation Failure

I am trying to set up a slowly changing dimension process in SnapLogic using the Snowflake SCD2 snap. This process reads a file on a daily basis and keeps historical data based on the status each record is in for that day's run (i.e. Open, Reviewed, Pending, Closed, etc.) so we can determine the number of days a record spends in each status. I have added the columns the SCD process needs to store the historical information, and the pipeline is set up with the Snowflake SCD2 snap connecting to the Snowflake Azure DB account. When I process the first file, everything works and the data is loaded to the table as I expect. When I try to process another file, I receive the error message below and the pipeline will not validate. Does anybody know what this error is trying to tell me? I set up the SCD2 snap based on the SnapLogic documentation.
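A minimal sketch of the backfill described in the Reverse Data Look Up question above, assuming the flattened records arrive in group order as dicts and each record carries its group's record count from the XML. The field names (account, count, identifier) and the middle account "xy" are placeholders for illustration, not from the original data:

```python
def backfill_identifiers(records):
    """Copy each group's identifier (present only on the group's last
    record) back onto the records above it, using the per-group count."""
    out = [dict(r) for r in records]  # work on copies
    i = 0
    while i < len(out):
        n = out[i]["count"]            # records in this group, from the XML
        group = out[i:i + n]
        ident = group[-1].get("identifier")  # only the last record has it
        for r in group:
            r["identifier"] = ident
        i += n                         # jump to the next group
    return out

records = [
    {"account": "ab",   "count": 1, "identifier": 105},
    {"account": "ds",   "count": 2},
    {"account": "ff",   "count": 2, "identifier": 106},
    {"account": "ss",   "count": 3},
    {"account": "xy",   "count": 3},   # hypothetical middle record
    {"account": "asdf", "count": 3, "identifier": 556},
]
print([r["identifier"] for r in backfill_identifiers(records)])
# → [105, 106, 106, 556, 556, 556]
```

If the counts are not reliable, an alternative is to walk the records in reverse, carrying the most recently seen identifier backward until the previous group's identifier appears.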
The Start and End Date calculation is just the current date, which I am converting to Eastern time. Reviewing the error message, I do not see any indication that the Start and End Date fields use any Joda-Time components. The Auto Historization query is based on LAST_CASE_UPDATE, which is the date the status changes; this field is defined as TIMESTAMP_NTZ(0).

ERROR: Failure: Failed to close database cursor connection., Reason: class java.lang.String cannot be cast to class org.joda.time.DateTime (java.lang.String is in module java.base of loader 'bootstrap'; org.joda.time.DateTime is in unnamed module of loader java.net.FactoryURLClassLoader @701fc37a), Resolution: Please ensure the database can still be accessed.

FAILED VALIDATION: SCD FIELDS: Snowflake Table Start and End Date field

Date.now().toLocaleDateTimeString({"timeZone":"US/Eastern","format":"yyyy-MM-dd hh:mm:ss"})

Re: Parent/Child Parameters

Thank you again, everybody, for the assistance. I guess I overlooked hardcoding a value in the Child parameter. Once I added a hardcoded value to the Child parameter, I was able to validate with no issues. Everything now seems to be working as I expect.

Re: Parent/Child Parameters

Thank you, everybody, for the input and assistance. I made the changes to the Parent pipeline to use $Path. I was not sure how to pass the $Path field directly to the Child without parameters, or how to run the Child without the Pipeline Execute snap. With the process I am setting up, anywhere from 1 to 15 files a day can land in the directory. I then modified the Child pipeline by removing the Mapper and using the _FileName parameter in the File Reader. With this new configuration, I am finally able to get a successful validation. However, the File Reader snap is always red on validation. Is this normal? In the Child Pipeline screenshot you can see it is red. I am able to run the pipelines, and the data is being loaded successfully based on record counts in the target table.
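The ClassCastException in the SCD2 question above reads as though the formatted string produced by the Date.now().toLocaleDateTimeString(...) expression is being handed to code that expects a date/time object. A toy sketch of that kind of mismatch, in plain Python rather than SnapLogic or Joda-Time; the function and its checks are invented purely for illustration:

```python
from datetime import datetime, timedelta, timezone

# Fixed UTC-5 offset used only for illustration; real Eastern time
# observes daylight saving and needs a proper tz database zone.
eastern = timezone(timedelta(hours=-5))

start_as_string = datetime.now(eastern).strftime("%Y-%m-%d %H:%M:%S")
start_as_datetime = datetime.now(eastern)

def close_history_row(end_date):
    """Hypothetical historization step that requires a real datetime,
    analogous to Java-side code expecting org.joda.time.DateTime."""
    if not isinstance(end_date, datetime):
        raise TypeError(f"expected datetime, got {type(end_date).__name__}")
    return end_date

close_history_row(start_as_datetime)       # a real datetime passes
try:
    close_history_row(start_as_string)     # a formatted string fails,
except TypeError as e:                     # like String -> DateTime
    print(e)
```

On this reading, it may be worth trying an expression that yields a date value rather than a formatted string for the Start and End Date fields, though that is a guess from the error text, not confirmed SnapLogic behavior.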
Parent Pipeline with Changes
Child Pipeline
Child Pipeline with Parameter

Re: Parent/Child Parameters

This is what is in Mapper 1 of the Parent. Since I can see the file names in the Directory Browser, Mapper 1, and Pipeline Execute snaps of the Parent, I would think they would carry over to the Child. When I view the output of Mapper 2 in the Child, there are no values. Should I see the values I am seeing from the Parent, i.e. the list of files that I can pass down the stream? How would I use the parameters in the Child? In the File Reader of the Child, I have tried both _FileName and $FileName and keep getting an error; I think it has something to do with the Mapper 2 values not coming in from the Parent.

Mapper 1 from Parent
File Reader from Child

Parent/Child Parameters

I'm new to SnapLogic. I am working on a solution to pull files from a network share and load them into a Snowflake table. I have a Parent pipeline set up to pull the file names from the network share using the Directory Browser snap, which feeds into a Pipeline Execute snap that runs the Child pipeline. I am including the full path and file name in my parameter. My issue is how to pass the parameters from the Parent to the Child. I have downloaded several examples of this from the boards and I still can't get it to work. I can see the file names in the preview of the Pipeline Execute snap of the Parent pipeline, but when I switch to the Child pipeline, the parameter is whatever is defined in the Pipeline Properties box when I do a preview. How do I pass the Parent parameters to the Child parameters? The file writer is using smb://network drive location/shared/xxxxx/xxxx/file name, which is why the equals sign is not depressed in the File Reader of the Child snap; for some reason smb does not work with the equals sign depressed. Thanks for any assistance.

Parent:
Parent Pipeline Execute:
Parent Pipeline Properties
Child:
Child Pipeline Properties:
Mapper Snap from Child:
Mapper Snap Child Preview
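As a rough analogy (plain Python, not SnapLogic) for the Parent/Child pattern discussed in this thread: the parent enumerates files and invokes the child once per file, and the child's pipeline parameter behaves like a function argument with a default, where the default stands in for the hardcoded value used during validation. All names and paths here are placeholders:

```python
import os

def child_pipeline(file_name="smb://placeholder/default.csv"):
    """Child: read one file and load it (stubbed here). The default
    plays the role of a hardcoded parameter value used in preview."""
    return f"loaded {file_name}"

def parent_pipeline(directory):
    """Parent: browse the directory and execute the child per file,
    passing the full path and file name as the parameter."""
    results = []
    for name in sorted(os.listdir(directory)):
        full_path = os.path.join(directory, name)
        results.append(child_pipeline(file_name=full_path))
    return results
```

On this analogy, validating the child on its own only ever sees the default value, which may be why a hardcoded parameter value was needed before the child would validate, while real runs receive each file's actual path from the parent. That is an inference from the thread, not confirmed SnapLogic behavior.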