Dynamic Data Validation

Ksivagurunathan
Contributor

I want to validate incoming data that arrives in CSV format. I have a target table defined for each file. Now I want to develop a common pipeline that validates the data based on the rules for each file.
Validation rules: not null, length, and datatype. I can extract this validation information from the target database dynamically, using SQL queries keyed on the file name. How do I merge these rules with the incoming data? My goal is to develop one pipeline so that even if the database changes (a new field is added or a datatype changes), I shouldn't have to change my pipeline; my SQL queries can pull the latest fields, datatypes, and other required information.

Basically, I'm trying to verify the data in the file before applying it to the target table, so that I can capture error records before I perform my DML operation.
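One way to make the rule extraction concrete: most databases expose column metadata through INFORMATION_SCHEMA, which already carries the three rules listed here (nullability, length, datatype). Below is a minimal sketch assuming that schema; the table and column names are made up, and in the real pipeline the query would run in a database Execute snap rather than in code.

```python
# Sketch: derive validation rules from the target table's metadata.
# INFORMATION_SCHEMA.COLUMNS is standard SQL; the placeholder style
# ("?") varies by driver. Table and column names here are hypothetical.
METADATA_QUERY = """
SELECT COLUMN_NAME, IS_NULLABLE, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = ?
"""

# Stand-in for the rows the query would return for one target table.
metadata_rows = [
    ("customer_id",   "NO",  "int",     None),
    ("customer_name", "NO",  "varchar", 50),
    ("email",         "YES", "varchar", 100),
]

# Fold the metadata into a rules dict keyed by column name, so the
# validation step never has to know the columns in advance.
rules = {
    name: {"nullable": nullable == "YES", "type": dtype, "max_len": max_len}
    for name, nullable, dtype, max_len in metadata_rows
}
```

Because the rules dict is rebuilt from metadata on every run, a new column or a datatype change shows up automatically without touching the pipeline.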

9 Replies

philliperamos
Contributor

I’m trying to do something similar.
Did you find a solution to the problem of dynamic data validation?

Yes, we stored our validation rules in tables/files and then used a Script Snap to validate the data against those rules. There is no built-in option within SnapLogic for dynamic data validation.
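For readers wondering what that Script Snap logic might look like, here is a hedged sketch in plain Python. A real Script Snap reads records from its input view and routes them to its output and error views; plain lists stand in for those views here, and the rules dict is hypothetical.

```python
# Hypothetical rules for one file, as pulled from the rules table.
rules = {
    "customer_id":   {"nullable": False, "type": "int",     "max_len": None},
    "customer_name": {"nullable": False, "type": "varchar", "max_len": 50},
}

# Very small datatype checks; extend per target database type.
TYPE_CHECKS = {
    "int":     lambda v: v.lstrip("-").isdigit(),
    "varchar": lambda v: True,  # any string is acceptable here
}

def validate(record, rules):
    """Return a list of violations; an empty list means the record is valid."""
    errors = []
    for column, rule in rules.items():
        value = record.get(column)
        if value is None or value == "":
            if not rule["nullable"]:
                errors.append(column + ": null not allowed")
            continue
        if rule["max_len"] and len(str(value)) > rule["max_len"]:
            errors.append(column + ": exceeds length " + str(rule["max_len"]))
        check = TYPE_CHECKS.get(rule["type"])
        if check and not check(str(value)):
            errors.append(column + ": not a valid " + rule["type"])
    return errors

# Lists stand in for the Snap's output and error views.
valid_docs, error_docs = [], []
for record in [{"customer_id": "42", "customer_name": ""}]:
    problems = validate(record, rules)
    if problems:
        record["errors"] = problems  # error records captured before DML
        error_docs.append(record)
    else:
        valid_docs.append(record)
```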

That sounds good.

I pivoted the data as JSON, and then used a combination of JSON Splitter, Mapper, and Data Validator snaps to validate the data. I still had to hard-code some JSON-generation scripts, but that was the easy part.
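A rough illustration of that pivot, with an assumed row layout: each flat CSV record is exploded into one document per field, which is approximately what the JSON Splitter hands downstream for per-field validation.

```python
# One flat CSV record exploded into per-field documents, which is
# roughly what the JSON Splitter emits for downstream validation.
# The row layout is assumed for illustration.
row = {"customer_id": "42", "customer_name": "Acme", "email": "a@b.c"}

pivoted = [{"field": name, "value": value} for name, value in row.items()]
# -> [{"field": "customer_id", "value": "42"}, ...]
```

Note that this multiplies the document count by the number of columns, which lines up with the memory problems mentioned at the end of the thread.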

Thanks!

The way we developed it, we don't need to change the SnapLogic pipeline to validate a different set of files, whether that means a new set of validations or a few new column validations for the same files. All we need to do is add the validation rule and the corresponding file name to the table. It's dynamic validation of both the data and the column names in a file.
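For illustration only (the actual schema isn't shown in the thread), a rules table driving that kind of pipeline might hold rows like these:

```python
# Illustrative rows for a file-driven rules table (hypothetical layout).
# Validating a new column or file only requires adding a row here.
validation_rules = [
    {"file_name": "customers.csv", "column": "customer_id",
     "rule": "not_null",   "param": None},
    {"file_name": "customers.csv", "column": "customer_name",
     "rule": "max_length", "param": 50},
    {"file_name": "orders.csv",    "column": "order_date",
     "rule": "datatype",   "param": "date"},
]
```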

Thanks. We're thinking of doing something similar for the dynamic validation approach, as the method I described above is leading to huge memory problems.