03-17-2022 10:06 AM
I’m planning to fetch data from a table using the “Snowflake Select” snap, with a WHERE clause that restricts the result to a specific set of ~1k records; however, it is not returning any results.
I could put the entire query in the “Snowflake Execute” snap instead, but I’d like to stick with Snowflake Select for now. Any help is highly appreciated.
Thanks!
03-23-2022 06:08 AM
Wanted to share the solution here. The query was right, and there was no need to change the “Fetch size” either.
The only thing missing was single quotes (' ') around the IDs being passed in the query. The corrected clause looks like:
WHERE NAME IN ('102345', '1701', '5604878')
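To illustrate why the quotes matter, here is a small Python sketch (with made-up sample data, not the actual table) treating NAME as a VARCHAR column, i.e. its values are strings. Without quotes the IN list contains numeric literals, and no string equals an integer, so nothing matches:

```python
# Hypothetical rows as the Snowflake Select snap might return them;
# NAME is a VARCHAR column, so its values are strings.
rows = [{"NAME": "102345"}, {"NAME": "1701"}, {"NAME": "9999"}]

# Without quotes the IDs are numeric literals: a string never equals
# an int, so the filter matches nothing (the empty result from the post).
unquoted_ids = [102345, 1701, 5604878]
no_match = [r for r in rows if r["NAME"] in unquoted_ids]

# With quotes ('102345', ...) the IDs are strings and compare correctly.
quoted_ids = ["102345", "1701", "5604878"]
matched = [r for r in rows if r["NAME"] in quoted_ids]
```

Here `no_match` stays empty while `matched` picks up the two rows whose NAME appears in the quoted list.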
04-26-2022 02:58 PM
Snowflake Lookup is definitely helpful, and in fact Snowflake Execute works wonders too when the joins are done in the query itself. However, with both Snowflake Lookup and Execute, I have seen pipeline performance degrade significantly:
a pipeline that used to complete within a minute now takes around 35–40 minutes just to get through the lookup/execute step.
04-29-2022 06:17 AM
@robin @bojanvelevski: Would you recommend using the Join snap instead of Snowflake Lookup and Execute for faster pipeline execution?
Please feel free to share any other suggestions as well; I’m sure they would expose me to something new (I’m still new to SnapLogic and finding my way around).
05-04-2022 01:48 PM
The database lookup pattern works well with an OLTP database, which has low query startup costs and is optimized for single-record operations. With a data warehouse like Snowflake, which is optimized for analytical queries, the per-query startup cost means ID-based lookups will not be as performant as they are on an OLTP database.
You could instead fetch the required columns of the lookup data set from Snowflake with a single select query and then use the In-Memory Lookup snap. If the lookup table fits in memory, that will be more performant. If the data set is too large to fit, an OLTP database is the better choice for such a use case.
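The in-memory lookup pattern described above can be sketched in plain Python (the data and field names here are assumptions for illustration, not the SnapLogic API): one bulk select pulls the whole lookup set, it is indexed into a dict, and each incoming record is enriched by an O(1) dict probe instead of issuing one warehouse query per record.

```python
# Sketch of the in-memory lookup pattern. Assumed data and field names;
# in SnapLogic this corresponds to Snowflake Select + In-Memory Lookup.

def build_lookup(lookup_rows):
    """Index the bulk-fetched lookup rows by their key column."""
    return {row["ID"]: row["REGION"] for row in lookup_rows}

def enrich(records, lookup):
    """Attach the looked-up value to every incoming record;
    unmatched keys get None rather than triggering a per-row query."""
    return [{**rec, "REGION": lookup.get(rec["ID"])} for rec in records]

# Sample data standing in for the single bulk SELECT result:
lookup_rows = [{"ID": "1701", "REGION": "EU"}, {"ID": "102345", "REGION": "US"}]
records = [{"ID": "1701"}, {"ID": "0000"}]

table = build_lookup(lookup_rows)
enriched = enrich(records, table)
```

The key trade-off is memory for latency: the warehouse is hit once at startup, so the per-record cost no longer includes Snowflake's query startup overhead.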