Hi @svatada @ptaylor
I have another issue; please let me know if you can help. When loading BLOB data into Snowflake (through SnapLogic), some of the values exceed the 16 MB maximum. The column is defined as VARBINARY on the Snowflake side. Do you know how we can store data larger than 16 MB? Is there an option to compress the data before loading it into Snowflake?
```
Error creating or writing input data to the temp table "WORKFLOW".MERGE_TEMP_TABLE_cf24af3e_3eaf_440e_a34e_55b9d8b6c635
Resolution:
Please file a defect against the snap
Reason:
Max LOB size (16777216) exceeded, actual size of parsed column is 69322752
  File '@~/INPUTVIEW/ae9f011a-ae4e-4895-aed1-5f0693e4e974/snowflakebuklload_data_af7638ff-1e5e-4147-84da-dcac60fe63b3.tmp.gz', line 1, character 69322762
  Row 1, column "MERGE_TEMP_TABLE_CF24AF3E_3EAF_440E_A34E_55B9D8B6C635"["$2":2]
  If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.

Snowflake - Bulk Upsert[5dc5175f62872f2f905a07fd_8fc8cf38-3a37-4fc1-b81e-d2dc72e6d006 -- 18e89a59-156f-459f-9cac-8b8bd30effac]
com.snaplogic.snap.api.SnapDataException: Error creating or writing input data to the temp table "WORKFLOW".MERGE_TEMP_TABLE_cf24af3e_3eaf_440e_a34e_55b9d8b6c635
```
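To illustrate the compression idea from my question, here is a minimal Python sketch (not SnapLogic-specific; gzip is just an assumed codec, and `MAX_LOB_BYTES` mirrors the 16777216 limit in the error above). Compression only helps if the compressed bytes actually fit under the cap, so a pre-check would be needed before the load:

```python
import gzip

# Snowflake VARBINARY cap, taken from the "Max LOB size (16777216)" error above.
MAX_LOB_BYTES = 16 * 1024 * 1024

def compress_for_load(blob: bytes) -> bytes:
    """Gzip-compress a blob and verify it fits under the 16 MB LOB limit.

    Hypothetical pre-processing step: whatever reads the column back
    from Snowflake would need to gunzip the value afterwards.
    """
    compressed = gzip.compress(blob, compresslevel=9)
    if len(compressed) > MAX_LOB_BYTES:
        # Compression alone is not enough for this row.
        raise ValueError(
            f"compressed size {len(compressed)} still exceeds {MAX_LOB_BYTES}"
        )
    return compressed
```

If a payload like the 69 MB row in the error still does not fit after compression, I assume the data would have to be split across multiple rows or kept outside the table, which is why I am asking what the recommended option is.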
Thanks for your help.