Forum Discussion
I have another issue. Please let me know if you can help. When trying to load BLOB data to Snowflake (through SnapLogic), some of the data exceeds the 16 MB maximum. The column for the blob data is created as “VARBINARY” on the Snowflake side. Do you know how we can store data larger than 16 MB? Is there an option to compress the data before loading it into Snowflake?
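For example, this is the kind of thing I have in mind on our side before the load. It is only a rough Python sketch (gzip plus chunking; the 16 MB constant matches the limit in the error below, and how the chunks would be keyed and reassembled is left open):

```python
import gzip

# Snowflake's LOB limit from the error below: 16777216 bytes (16 MB).
MAX_LOB_BYTES = 16 * 1024 * 1024

def prepare_blob(raw: bytes) -> list[bytes]:
    """Compress a blob and, if it is still over the limit, split it into
    pieces that each fit in one VARBINARY column value."""
    compressed = gzip.compress(raw)
    if len(compressed) <= MAX_LOB_BYTES:
        return [compressed]
    # Fall back to chunking: each piece would go into its own row with a
    # sequence number so a reader can reassemble them in order.
    return [compressed[i:i + MAX_LOB_BYTES]
            for i in range(0, len(compressed), MAX_LOB_BYTES)]

# A 69 MB payload like the one in the error message comes back as one or
# more <=16 MB chunks, depending on how well it compresses.
chunks = prepare_blob(b"\x00" * 69322752)
print(len(chunks), [len(c) for c in chunks])
```

For reference, this is the error we currently get: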
```
Error creating or writing input data to the temp table "WORKFLOW".MERGE_TEMP_TABLE_cf24af3e_3eaf_440e_a34e_55b9d8b6c635
Resolution: Please file a defect against the snap
Reason: Max LOB size (16777216) exceeded, actual size of parsed column is 69322752
  File '@~/INPUTVIEW/ae9f011a-ae4e-4895-aed1-5f0693e4e974/snowflakebuklload_data_af7638ff-1e5e-4147-84da-dcac60fe63b3.tmp.gz', line 1, character 69322762
  Row 1, column "MERGE_TEMP_TABLE_CF24AF3E_3EAF_440E_A34E_55B9D8B6C635"["$2":2]
  If you would like to continue loading when an error is encountered, use other values such as 'SKIP_FILE' or 'CONTINUE' for the ON_ERROR option. For more information on loading options, please run 'info loading_data' in a SQL client.

Snowflake - Bulk Upsert[5dc5175f62872f2f905a07fd_8fc8cf38-3a37-4fc1-b81e-d2dc72e6d006 -- 18e89a59-156f-459f-9cac-8b8bd30effac]
com.snaplogic.snap.api.SnapDataException: Error creating or writing input data to the temp table "WORKFLOW".MERGE_TEMP_TABLE_cf24af3e_3eaf_440e_a34e_55b9d8b6c635
```
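As far as I understand, the ON_ERROR option suggested at the end of the log only controls whether the load skips problem rows; it would not let a 69 MB value through the 16 MB cap. For reference, this is roughly how that option would be set on a manual COPY through the Snowflake Python connector (the connection parameters, stage path, and table name below are placeholders, not our real objects):

```python
import snowflake.connector

# Placeholder connection details; substitute real values.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="...",
    warehouse="my_wh",
    database="my_db",
    schema="WORKFLOW",
)
try:
    # ON_ERROR = 'CONTINUE' loads the rows it can and skips the rest;
    # it does NOT raise the per-value 16 MB LOB limit.
    conn.cursor().execute("""
        COPY INTO MERGE_TEMP_TABLE
        FROM @~/INPUTVIEW/
        FILE_FORMAT = (TYPE = CSV)
        ON_ERROR = 'CONTINUE'
    """)
finally:
    conn.close()
```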
Thanks for your help.
Sorry, I cannot; I know next to nothing about Snowflake. I’d try some Google searches and then Snowflake community forums.
Thanks, we will look into this. Is there a solution without Ultra Pipelines? We're not planning on using Ultra.
- nganapathiraju · 8 years ago · Former Employee
> DO I NEED TO BUILD A SOAP SERVICE USING ULTRA?
>
> Not necessarily:
>
> - Yes, if you intend to use the SOAPAction header to route the request (see initial version of the pipeline at the beginning of this article).
> - No, if you intend to use the SOAP body (see simplified approach above).

This is from the document posted above. Can you try that?
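To make the two options concrete, here is roughly what they look like from the caller's side. This is only a sketch: the endpoint URL and the GetOrder operation are made-up examples, and the exact envelope depends on your service.

```python
import requests

# Hypothetical SnapLogic endpoint; substitute your own pipeline URL.
URL = "https://example.snaplogic.io/api/1/rest/feed/my_org/my_project/soap_pipeline"

ENVELOPE = """<?xml version="1.0"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body><GetOrder><OrderId>42</OrderId></GetOrder></soap:Body>
</soap:Envelope>"""

# Option 1: route on the SOAPAction header (the "Yes, Ultra" path above).
requests.post(URL, data=ENVELOPE, headers={
    "Content-Type": "text/xml",
    "SOAPAction": "GetOrder",
})

# Option 2: no SOAPAction header; the pipeline inspects the operation
# element inside the SOAP body instead (the simplified approach).
requests.post(URL, data=ENVELOPE, headers={"Content-Type": "text/xml"})
```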