Inserting large data into ServiceNow

deepanshu_1
New Contributor III
Hello Team,
 
I am developing a pipeline in SnapLogic where 6,000,000 records come from Snowflake, and I have designed the pipeline as follows:
 
Parent pipeline: Snowflake Execute -> Mapper (one-to-one field mapping) -> Group By N with a group size of 10,000 -> Pipeline Execute with a Pool Size of 5. In the child pipeline I have used a JSON Splitter and a ServiceNow Insert (see the sketch below).
 
What can I do to optimize the performance and make it execute faster in SnapLogic? It currently takes a long time to run.
 
Can someone assist in this regard?
 
Thanks in advance.
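For readers less familiar with SnapLogic, here is a rough Python model of the flow described above. It is purely illustrative: the function names and the no-op insert are stand-ins, not SnapLogic code. Records stream out of Snowflake, get grouped into chunks of 10,000, and a pool of 5 workers splits each chunk back apart and inserts records one at a time.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import islice

BATCH_SIZE = 10_000   # Group By N group size
POOL_SIZE = 5         # Pipeline Execute pool size

def batched(rows, n):
    """Yield successive lists of up to n rows (the Group By N step)."""
    it = iter(rows)
    while batch := list(islice(it, n)):
        yield batch

def insert_into_servicenow(record):
    """Hypothetical placeholder for the actual ServiceNow insert."""
    pass

def child_pipeline(batch):
    """Stand-in for the child: JSON Splitter + ServiceNow Insert,
    i.e. unwrap the batch and insert records one at a time."""
    for record in batch:
        insert_into_servicenow(record)

def run(rows):
    with ThreadPoolExecutor(max_workers=POOL_SIZE) as pool:
        # Fan each 10,000-record batch out to one of 5 workers;
        # iterating the results surfaces any worker exceptions.
        for _ in pool.map(child_pipeline, batched(rows, BATCH_SIZE)):
            pass

if __name__ == "__main__":
    run({"id": i} for i in range(50_000))  # small stand-in for 6M rows
```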
 
2 REPLIES

koryknick
Employee

@deepanshu_1 - You can relieve some of the memory pressure on your Snaplex nodes by removing the Group By N and JSON Splitter snaps and using the Batch Size option in the Pipeline Execute snap instead. This accomplishes exactly the same result without the memory cost of combining large sets of records into a single document as an array (the sketch below illustrates the difference).

If you are still experiencing slowness, you can follow up with your ServiceNow admins to see if anything can be done on that side. I believe ServiceNow is not really meant for bulk operations, so inserting millions of records into ServiceNow is probably your bottleneck.
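A minimal sketch of that memory difference, in plain Python rather than SnapLogic internals (the `child` callback is a hypothetical stand-in for the child pipeline): Group By N materializes each 10,000-record batch as one large array document that the JSON Splitter then has to unwrap again, while Batch Size just counts documents off the stream, so no wrapping array is ever built.

```python
from itertools import chain, islice

def child(docs):
    """Stand-in child pipeline: consume and insert each document."""
    for doc in docs:
        pass  # the ServiceNow Insert would go here

def group_by_n_then_split(records, n, child):
    """Group By N + JSON Splitter: each batch of n records is held
    as one large array document, shipped to the child, and then
    immediately unwrapped -- peak memory is O(n) per batch."""
    it = iter(records)
    while batch := list(islice(it, n)):
        document = {"group": batch}       # the big array document
        child(iter(document["group"]))    # JSON Splitter undoes the wrap

def batch_size(records, n, child):
    """Pipeline Execute with Batch Size: the child still sees up to
    n documents per run, but they stream through one at a time."""
    it = iter(records)
    while (first := next(it, None)) is not None:
        child(chain([first], islice(it, n - 1)))
```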

ssapa
Employee

@deepanshu_1 The ServiceNow Insert Snap supports batching. Did you use the Page Size option to set the batch size? Please try increasing the Page Size value and see if that helps.
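To illustrate what a larger page size buys you on the wire, here is a hypothetical sketch using ServiceNow's Import Set API `insertMultiple` endpoint, which accepts multiple records per request (availability depends on your ServiceNow release; the instance URL, staging table, and credentials below are placeholders). One HTTP request carries a whole page of records instead of one request per record, which is the effect a larger Page Size has on throughput.

```python
from itertools import islice

import requests

INSTANCE = "https://example.service-now.com"  # placeholder instance
STAGING_TABLE = "u_incident_import"           # placeholder import set table
AUTH = ("username", "password")               # placeholder credentials
PAGE_SIZE = 200                               # records per request

def insert_pages(records):
    """Insert records one page at a time: each POST carries up to
    PAGE_SIZE records, cutting per-request overhead versus one
    request per record."""
    it = iter(records)
    while page := list(islice(it, PAGE_SIZE)):
        resp = requests.post(
            f"{INSTANCE}/api/now/import/{STAGING_TABLE}/insertMultiple",
            json={"records": page},
            auth=AUTH,
            headers={"Accept": "application/json"},
            timeout=60,
        )
        resp.raise_for_status()
```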