06-18-2021 02:03 PM
I am working on a SnapLogic pipeline to copy data from Oracle to Snowflake. A few tables at the source have a BLOB column which needs to be copied. SnapLogic sees them as "Base64 encoded data", but Snowflake needs the data in HEX format. I would like to know if it's possible to convert the data as required and load it into the Snowflake tables. I've attached a sample pipeline.
TEST_2021_06_18.slp (7.1 KB)
06-21-2021 10:05 PM
@ADORAISWAMY: the file format option in our snap can help you here. If the incoming data is in HEX format, then use the file format option `BINARY_FORMAT=HEX`.
`BINARY_FORMAT = XXX` (where XXX = `HEX` | `BASE64` | `UTF8`)
Reference:
CREATE FILE FORMAT - Snowflake Documentation
[Snowflake - Bulk Upsert Snap Documentation](https://docs-snaplogic.atlassian.net/wiki/spaces/SD/pages/1438603/Snowflake+-+Bulk+Upsert)
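As a quick illustration of the two encodings in play (a Python sketch with made-up sample bytes; the pipeline itself never runs Python):

```python
import base64

# Hypothetical sample bytes standing in for a BLOB value from Oracle.
raw = b"\xde\xad\xbe\xef"

# SnapLogic hands the binary data over as Base64 text...
as_base64 = base64.b64encode(raw).decode("ascii")
print(as_base64)  # 3q2+7w==

# ...whereas with BINARY_FORMAT=HEX Snowflake expects the same bytes as hex text.
as_hex = raw.hex().upper()
print(as_hex)  # DEADBEEF
```

Both strings decode to the same four bytes, so setting `BINARY_FORMAT=BASE64` in the file format lets Snowflake parse SnapLogic's output directly, with no conversion step in the pipeline.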
06-22-2021 09:22 AM
06-22-2021 12:06 PM
I have another issue; please let me know if you can help. When trying to load the BLOB data into Snowflake (through SnapLogic), some of the data exceeds the 16 MB maximum. The column for the BLOB data is created as VARBINARY on the Snowflake side. Do you know how we can store data larger than 16 MB? Is there an option to compress the data before loading it into Snowflake?
```
Error creating or writing input data to the temp table "WORKFLOW".MERGE_TEMP_TABLE_cf24af3e_3eaf_440e_a34e_55b9d8b6c635
Reason:
Max LOB size (16777216) exceeded, actual size of parsed column is 69322752
  File '@~/INPUTVIEW/ae9f011a-ae4e-4895-aed1-5f0693e4e974/snowflakebuklload_data_af7638ff-1e5e-4147-84da-dcac60fe63b3.tmp.gz', line 1, character 69322762
  Row 1, column "MERGE_TEMP_TABLE_CF24AF3E_3EAF_440E_A34E_55B9D8B6C635"["$2":2]
  If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.
Resolution:
Please file a defect against the snap

Snowflake - Bulk Upsert[5dc5175f62872f2f905a07fd_8fc8cf38-3a37-4fc1-b81e-d2dc72e6d006 -- 18e89a59-156f-459f-9cac-8b8bd30effac]
com.snaplogic.snap.api.SnapDataException: Error creating or writing input data to the temp table "WORKFLOW".MERGE_TEMP_TABLE_cf24af3e_3eaf_440e_a34e_55b9d8b6c635
```
Thanks for your help.
06-22-2021 08:16 PM
Sorry, I cannot; I know next to nothing about Snowflake. I'd try some Google searches and then the Snowflake community forums.
06-23-2021 12:51 PM
Since you are already writing to a BINARY column, you should be able to first compress the data and then Base64-encode it using `Base64.encode(GZip.compress($CONTENT))`.
Just FYI - based on the Snowflake documentation, the maximum length of the BINARY/VARBINARY data type is 8 MB, not 16 MB. Hopefully the compression helps!
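Whether compression actually gets a 69 MB BLOB under that limit depends entirely on how compressible the data is. A minimal Python sketch of the compress-then-encode step (the payload here is made up; `$CONTENT` refers to the field used in the SnapLogic expression above):

```python
import base64
import gzip

# Hypothetical stand-in for the pipeline's $CONTENT field (a large, compressible BLOB).
blob = b"repetitive payload " * 100_000  # 1,900,000 bytes

# Python equivalent of the SnapLogic expression Base64.encode(GZip.compress($CONTENT)):
compressed = gzip.compress(blob)
encoded = base64.b64encode(compressed).decode("ascii")

# Base64 inflates its input by a factor of 4/3, so the compressed payload must
# stay under roughly 6 MB for the encoded text to fit an 8 MB BINARY column.
print(len(blob), len(compressed), len(encoded))

# Reading the value back: Base64-decode, then decompress.
assert gzip.decompress(base64.b64decode(encoded)) == blob
```

Note that already-compressed content (images, PDFs, encrypted data) will barely shrink, so for such BLOBs a Snowflake stage plus an external reference may be the only option.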