Fetching 1k records from a table using the "Snowflake Select snap"

Hi Sir Darsh,

Good day, hope all is well.

It looks like this comes from a setting on the account you are using in the Snowflake Execute snap, where Fetch size is a required setting. Hope this helps.


Thanks,
EmEm


Thank you @alchemiz, I did check the account settings, and Fetch size is mandatory as you mentioned.
Can you walk me through how to resolve this?

Do we need to make Fetch size non-mandatory, or increase it from 100 to some number x? If it's the latter, what would that number be?
Thank you in advance…

Wanted to share the solution here. The query was right, and there was no need to change the Fetch size either.
The only thing missing was single quotes around the IDs we were passing in the query. The query should look like:
WHERE NAME IN ('102345', '1701', '5604878')
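For anyone building the ID list dynamically upstream, a small helper like the hypothetical one below (the function name and the quote-doubling escape are my own illustration, not part of the original pipeline) produces a correctly single-quoted IN list:

```python
def quoted_in_list(ids):
    """Build a single-quoted SQL IN list from string IDs.
    Embedded single quotes are doubled, per standard SQL escaping."""
    quoted = ", ".join("'" + str(i).replace("'", "''") + "'" for i in ids)
    return "(" + quoted + ")"

clause = "WHERE NAME IN " + quoted_in_list(["102345", "1701", "5604878"])
print(clause)  # WHERE NAME IN ('102345', '1701', '5604878')
```

Note that string-building like this is exactly what opens the door to SQL injection if the IDs come from untrusted input, which is why bind parameters (discussed below in this thread) are the safer long-term option.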


The Snowflake Lookup snap covers this functionality. It allows lookups on any number of IDs and will batch internally as required. The Lookup snap also uses bind parameters, avoiding the SQL injection risk that can arise when a SQL statement string is constructed in the Select snap.
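To illustrate why bind parameters matter, here is a minimal sketch using Python's built-in sqlite3 as a stand-in database. This only demonstrates the principle the Lookup snap applies, not SnapLogic code, and placeholder syntax varies by driver (sqlite3 uses `?`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (name TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [("102345",), ("1701",), ("5604878",)])

ids = ["102345", "1701"]
# One "?" placeholder per ID; the values travel separately from the SQL text,
# so a malicious "ID" like "x') OR ('1'='1" stays an inert literal value.
placeholders = ", ".join("?" for _ in ids)
rows = conn.execute(
    f"SELECT name FROM t WHERE name IN ({placeholders})", ids
).fetchall()
print(sorted(rows))  # [('102345',), ('1701',)]
```

Because the database receives the statement and the values as separate items, no quoting of the IDs is needed either, which would also have avoided the missing-quotes issue earlier in this thread.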


Thank you @akidave for the suggestion and long term solution.

Snowflake Lookup is definitely helpful, and in fact Snowflake Execute does wonders too if joins are used in the query itself. However, with both Snowflake Lookup and Execute, I have seen the pipeline's performance degrade.

The pipeline that used to complete within a minute now takes around 35-40 minutes just to get through the Lookup/Execute snap.

@robin @bojanvelevski: Would you recommend using the Join snap instead of Snowflake Lookup and Execute for faster pipeline execution?

Please feel free to share any other suggestions; they would expose me to something new (I'm still new to SnapLogic and learning my way around).

The database lookup pattern works well with an OLTP database, which has low query startup costs and is optimized for single-record operations. With a data warehouse like Snowflake, which is optimized for analytical queries, the query startup cost makes ID-based lookups less performant than they would be on an OLTP database.

You could fetch the required columns of the data set from Snowflake using a select query and then use the In-Memory Lookup snap. If the lookup table fits in memory, that would be more performant. If the data set is too large, an OLTP database would be better suited for such a use case.
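As a rough sketch of that pattern (the column names and rows here are hypothetical, just to show the shape of the data), the in-memory approach amounts to fetching the reference data once and indexing it in a hash map, so each incoming record is resolved locally instead of with a per-record warehouse query:

```python
# Hypothetical reference rows, as fetched once from Snowflake via a select query.
reference_rows = [
    {"NAME": "102345", "REGION": "EMEA"},
    {"NAME": "1701", "REGION": "APAC"},
    {"NAME": "5604878", "REGION": "AMER"},
]

# Index the reference data by the lookup key once; after this, every
# incoming record is enriched with an O(1) dictionary lookup.
by_name = {row["NAME"]: row for row in reference_rows}

incoming = [{"NAME": "1701"}, {"NAME": "9999"}]
enriched = [
    {**rec, "REGION": by_name.get(rec["NAME"], {}).get("REGION")}
    for rec in incoming
]
print(enriched)
# [{'NAME': '1701', 'REGION': 'APAC'}, {'NAME': '9999', 'REGION': None}]
```

The trade-off is memory: this only pays off while the reference table fits comfortably in the node's memory, which matches the caveat above.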
