Converting Objects with arrays into row format for DB storage

Hi,

I'm having trouble converting an object that contains arrays into a format that can be inserted into a table. I have the following dynamic JSON document:

```
"testcode": {
  "1203": [1, 2, 5, 6, 8, 10, 11, 12, 13, 14, 16, 17],
  "1205": [9],
  "1206": [7, 3, 4, 15]
}
```

Here, testcode is an object whose keys are 1203, 1205, and 1206. However, these keys change, so they are not a fixed set. I want to store the data above in a two-field row format. Using 1203 as an example:

```
1203,1
1203,2
1203,5
1203,6
1203,8
1203,10
1203,11
1203,12
1203,13
1203,14
1203,16
1203,17
```

Is this possible? If not, I will store it as a VARIANT instead.

Thanks,
David

Re: Problems with Date.parse() function

Sure. It is actually the same date in my reply; it only fails because it returns NaN. The weird thing is that I have a similar job running the same way, and it does not fail.

```
Date.parse("2021-03-14 02:40:18.000", "yyyy-MM-dd HH:mm:ss" + ".000")
```

The difference is the .000 at the end.

Re: Problems with Date.parse() function

```
Date.parse("2021-03-14 02:40:18.000", "yyyy-MM-dd HH:mm:ss.SSS")
```

I used this in my Mapper and it worked. However, when I use the same approach in my existing SnapLogic job, with the value coming from a field, it fails. I will just do the date manipulation inside SQL, since the standard date functions in SnapLogic seem buggy to me.

Re: Problems with Date.parse() function

I have a similar issue. When I use a literal string, it works; when I use a field, it does not. Additionally, it only started happening yesterday, 2021-03-14. This fails. This works.

Download Snaplogic Pipelines and Tasks List

Hi,

We are trying to consolidate a list of our pipelines and tasks. Is there a way to download this from SnapLogic, for example from a repository or database? Has anyone done this in the past?

Thanks,
David

Re: Loading JSON to Snowflake

Figured this out.
The important thing about using VARIANT is that you want to store the JSON as JSON, not as a string. To do this, I basically had to convert Document to Binary and then Binary back to Document. Strange, but it works.

Re: Loading JSON to Snowflake

Please note, I'm querying with a JIRA Search snap. Using JQL, I can return JSON documents and bulk-upload them to Snowflake using Snowflake Bulk Upload with Format Type: JSON. The table we are loading contains only one column, of data type VARIANT.

Note: I did try adding a JSON Formatter and a JSON Parser before loading into Snowflake. That only worked with Snowflake Insert, and instead of VARIANT (JSON) it treated the JIRA data as text.

Re: Loading JSON to Snowflake

I'm getting this error:

No column values found in input document to load into SF database table.
Reason: The input to the SF Bulk Load snap should contain the column values to load into SF database.
Resolution: All columns which do not have default values have to be specified in the input document.

Snowflake - Bulk Load[58b7a2d080b28239d980b31f_6d77b238-ce5f-4f6b-86cc-be57f461663c – 1e79a01b-9983-4bdb-a2ed-135917bc152f]

```
com.snaplogic.snap.api.SnapDataException: No column values found in input document to load into SF database table.
    at com.snaplogic.snaps.sql.SimpleSqlSnap.process(SimpleSqlSnap.java:393)
    at com.snaplogic.snaps.snowflake.BulkLoad.execute(BulkLoad.java:420)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.executeSnap(SnapRunnableImpl.java:768)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.execute(SnapRunnableImpl.java:550)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.doRun(SnapRunnableImpl.java:834)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:400)
    at com.snaplogic.cc.snap.common.SnapRunnableImpl.call(SnapRunnableImpl.java:116)
    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
    at java.base/java.util.concurrent.FutureTask.run(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.base/java.lang.Thread.run(Unknown Source)
Caused by: com.snaplogic.api.ExecutionException: No column values found in input document to load into SF database table.
    at com.snaplogic.snaps.snowflake.BulkLoad.storeInputDocument(BulkLoad.java:550)
    at com.snaplogic.snaps.snowflake.BulkLoad.processDocument(BulkLoad.java:484)
    at com.snaplogic.snaps.sql.SimpleSqlSnap.process(SimpleSqlSnap.java:384)
    ... 12 more
Reason: The input to the SF Bulk Load snap should contain the column values to load into SF database.
Resolution: All columns which do not have default values have to be specified in the input document.
Error Fingerprint[0] = efp:com.snaplogic.snaps.sql.8mD-ksio
Error Fingerprint[1] = efp:com.snaplogic.snaps.snowflake.IT5dMonA
```

pool-4-thread-11817

```
com.snaplogic.cc.snap.common.ThreadDetails: prio=4 Id=74189 TIMED_WAITING on java.util.concurrent.SynchronousQueue$TransferStack@773794a
    at java.base@11.0.5/jdk.internal.misc.Unsafe.park(Native Method)
    - waiting on java.util.concurrent.SynchronousQueue$TransferStack@773794a
    at java.base@11.0.5/java.util.concurrent.locks.LockSupport.parkNanos(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.SynchronousQueue$TransferStack.transfer(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.SynchronousQueue.poll(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.ThreadPoolExecutor.getTask(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    ...
    at java.base@11.0.5/jdk.internal.misc.Unsafe.park(Native Method)
    at java.base@11.0.5/java.util.concurrent.locks.LockSupport.parkNanos(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.SynchronousQueue$TransferStack.transfer(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.SynchronousQueue.poll(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.ThreadPoolExecutor.getTask(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
    at java.base@11.0.5/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
    at java.base@11.0.5/java.lang.Thread.run(Unknown Source)
Error Fingerprint[0] = efp:jdk.internal.misc.rn-dMs9v
```

Loading JSON to Snowflake

Hi,

I've been trying to load JSON data into a VARIANT field in Snowflake.
I want to avoid writing to external storage such as an S3 bucket and then doing a bulk upload; I would prefer to load straight into Snowflake. I've tried the following snaps:

Snowflake Bulk Upload - fails
Snowflake Bulk Merge - fails
Snowflake Insert - slow, but it works (I had to remove all other fields)

My table is a mix of VARCHAR (3 columns) and 1 VARIANT column. My question is: what is the suitable snap for loading JSON into Snowflake?

Thanks,
David
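Coming back to the first question on this page (flattening an object of arrays into two-field rows): the transformation itself is a simple key/element pairing. A minimal Python sketch, using the sample document from that post:

```python
# Flatten {"testcode": {key: [values...]}} into (key, value) row pairs,
# one row per array element.  Keys are dynamic, so we iterate all of them.
doc = {
    "testcode": {
        "1203": [1, 2, 5, 6, 8, 10, 11, 12, 13, 14, 16, 17],
        "1205": [9],
        "1206": [7, 3, 4, 15],
    }
}

rows = [(code, value)
        for code, values in doc["testcode"].items()
        for value in values]

print(rows[:3])  # [('1203', 1), ('1203', 2), ('1203', 5)]
```

Inside a SnapLogic pipeline a similar shape can often be produced with a Mapper plus a splitter snap, but the comprehension above shows the intended row format independent of any snap.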
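On the Date.parse() thread above: the Java-style pattern "yyyy-MM-dd HH:mm:ss.SSS" corresponds to Python's "%Y-%m-%d %H:%M:%S.%f". One hypothesis worth checking (not a confirmed cause) for a failure that began exactly on 2021-03-14 is daylight saving time: 02:40 local time did not exist that morning in US time zones, since clocks jumped from 02:00 to 03:00. A Python sketch showing that the literal parses fine as a naive timestamp, while the wall-clock time does not survive a round trip through a DST-observing zone:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# The literal parses cleanly; %f accepts the trailing .000 milliseconds.
ts = datetime.strptime("2021-03-14 02:40:18.000", "%Y-%m-%d %H:%M:%S.%f")

# 2021-03-14 02:40 falls inside the US spring-forward gap: that wall-clock
# time never occurred, so round-tripping through UTC lands on a later hour.
local = ts.replace(tzinfo=ZoneInfo("America/New_York"))
roundtrip = local.astimezone(timezone.utc).astimezone(ZoneInfo("America/New_York"))
print(ts.time(), "->", roundtrip.time())
```

Whether SnapLogic's Date.parse() is sensitive to this depends on the Snaplex timezone configuration, which is an assumption here; the snippet only demonstrates that the date itself is special.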
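For the "which snap is suitable" question: one pattern that avoids the JSON-treated-as-text problem described in the replies is to keep the document as a string on the way in and let Snowflake parse it server-side with its PARSE_JSON function. The sketch below only builds such an INSERT statement; the table name JIRA_RAW and column DATA are placeholders, the connection itself is omitted, and in production you would use parameter binding rather than string interpolation:

```python
import json

def build_variant_insert(table: str, column: str, payload: dict) -> str:
    """Build an INSERT that stores `payload` as a real VARIANT, not text."""
    literal = json.dumps(payload).replace("'", "''")  # escape for a SQL string literal
    return f"INSERT INTO {table} ({column}) SELECT PARSE_JSON('{literal}')"

sql = build_variant_insert("JIRA_RAW", "DATA",
                           {"key": "PROJ-1", "fields": {"status": "Done"}})
print(sql)
```

The Document-to-Binary-and-back trick described in the "Figured this out" reply achieves the same goal inside a pipeline: it keeps the payload serialized as JSON text until Snowflake can parse it into a VARIANT.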