Hello, I have a process to load millions of rows from one Snowflake table to another Snowflake table (possibly from different databases within Snowflake due to security and sensitive data). Both the source and the target tables have almost 300 columns...
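A minimal sketch of the cross-database copy described above, assuming the snowflake-connector-python client; the account credentials and the database, schema, and table names are placeholders. A single INSERT ... SELECT with fully qualified names avoids pulling the rows through the client, though with ~300 columns an explicit column list on both sides is safer than SELECT *.

```python
# Hypothetical sketch: copy rows between Snowflake databases in one statement.
# Credentials and object names below are placeholders, not the poster's values.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="my_wh",
    role="my_role",
)

try:
    cur = conn.cursor()
    # Fully qualified names let one statement read from the source database
    # and write into the target database, provided the role can see both.
    cur.execute(
        "INSERT INTO TARGET_DB.TARGET_SCHEMA.TARGET_TABLE "
        "SELECT * FROM SOURCE_DB.SOURCE_SCHEMA.SOURCE_TABLE"
    )
    print(f"Rows inserted: {cur.rowcount}")
finally:
    conn.close()
```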
Hello, Below is the requirement where I have to read and parse a CSV file and load each line based on the first column's value. The file will have thousands and thousands of lines. I have the sample CSV file below; the 1st line will be the Header and will ...
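A minimal sketch of routing each line by its first column, assuming record types like Header, Summary, and Submission as in the sample; the file path and handler functions are placeholders.

```python
# Hypothetical sketch: stream a large CSV and dispatch each row on column 0.
import csv

def handle_header(row):        # placeholder handlers, one per record type
    print("header:", row)

def handle_summary(row):
    print("summary:", row)

def handle_submission(row):
    print("submission:", row)

handlers = {
    "Header": handle_header,
    "Summary": handle_summary,
    "Submission": handle_submission,
}

with open("input.csv", newline="") as f:
    for row in csv.reader(f):
        if not row:
            continue                      # skip blank lines
        handler = handlers.get(row[0])
        if handler:
            handler(row)                  # route on the first column's value
```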
Hello, I am building an API that uses the PUT method to update a set of JSON data into my target database. One of the contents of the PUT method is a boolean type that would receive either "true" or "false" and is a mandatory entity. I am using a Data Va...
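A minimal sketch of the kind of check the validation needs to enforce: the field must be present and must be a boolean or the strings "true"/"false". The field name "is_active" and the error messages are assumptions for illustration only.

```python
# Hypothetical sketch: validate a mandatory boolean field in a PUT payload.
def validate_flag(payload: dict) -> bool:
    if "is_active" not in payload:
        raise ValueError("is_active is mandatory")
    value = payload["is_active"]
    if isinstance(value, bool):
        return value
    if isinstance(value, str) and value.lower() in ("true", "false"):
        return value.lower() == "true"
    raise ValueError("is_active must be true or false")

print(validate_flag({"is_active": "true"}))   # True
print(validate_flag({"is_active": False}))    # False
```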
Hello, I have a process where files need to be deleted from the SFTP folder after they are copied and processed. At a given point in time there could be hundreds of files, and after copying and processing these files, I have a separate pro...
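A minimal sketch of the cleanup step described above, using the third-party paramiko library; the host, credentials, remote folder, and file names are placeholders, and only files already confirmed as processed are removed.

```python
# Hypothetical sketch: delete SFTP files only after they have been processed.
import paramiko

processed_files = ["orders_001.csv", "orders_002.csv"]  # confirmed as processed

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("sftp.example.com", username="user", password="secret")

try:
    sftp = client.open_sftp()
    for name in processed_files:
        remote_path = f"/inbound/{name}"
        sftp.remove(remote_path)          # remove the already-processed file
        print("deleted", remote_path)
    sftp.close()
finally:
    client.close()
```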
Hello, I have a scenario where I use the Gate Snap and the output of the snap would be an empty array or a list of entities in the array, as shown below. If the array is empty, it would give output with "original" values (I need this to take the values...
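A minimal sketch of the branching logic described above: fall back to the "original" document when the gathered array is empty, otherwise use the array entries. The field names "original" and "group" are assumptions standing in for the actual Gate Snap output keys.

```python
# Hypothetical sketch: choose between the original values and the array entries.
def resolve(doc: dict) -> list:
    group = doc.get("group", [])
    if not group:                    # empty array -> keep the original values
        return [doc.get("original", {})]
    return group                     # otherwise use the entries in the array

print(resolve({"original": {"id": 1}, "group": []}))
print(resolve({"original": {"id": 1}, "group": [{"id": 2}, {"id": 3}]}))
```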
@koryknick, Thank you for the multiple solutions. I tried both, and the 2nd one seems more interesting by eliminating CSV parsers (realistically, I will be using this methodology where the CSV file will have at least 15 to 20 different types of lin...
@SpiroTaleski, There are 2 things I wanted to update: the CSV doesn't have a fixed length or a fixed number of columns, and each type (e.g. Header, Summary, Submission) will have a different number of columns, and I don't want to trim the number of columns, bu...
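A minimal sketch of one way to handle the variable column counts mentioned above without trimming anything: map each row against a per-type field list and keep any leftover values. The type names and field lists are placeholders for illustration.

```python
# Hypothetical sketch: per-record-type field mapping with no column trimming.
import csv

fields_by_type = {
    "Header":     ["type", "file_date", "source"],
    "Summary":    ["type", "count", "total", "currency"],
    "Submission": ["type", "id", "name", "amount", "status"],
}

def to_document(row):
    names = fields_by_type.get(row[0], [])
    doc = dict(zip(names, row))              # map known positions to field names
    if len(row) > len(names):
        doc["extra"] = row[len(names):]      # keep extra columns, untrimmed
    return doc

with open("input.csv", newline="") as f:
    for row in csv.reader(f):
        if row:
            print(to_document(row))
```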