Head/Tail snap dynamically setting Offset value

mithsrini
New Contributor II

Hello,

I am working on loading data from a big Excel file into a database. I want to use the Head/Tail Snaps and set the “Documents offset*” property dynamically from an upstream value. Please also suggest if there is a better alternative to this approach.

Thanks,
Mithila

6 REPLIES

koryknick
Employee

Did you look at the Excel Parser “Start row” and “End row” properties?
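For example (assuming the row numbers are 1-based, so row 1 is the header), setting “Start row” = 2 and “End row” = 10001 would have the parser emit only the first 10,000 data rows of the file.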

mithsrini
New Contributor II

Sure, I will try that. Thanks for the quick response.

Thanks,

Mithila

mithsrini
New Contributor II

But this is effectively reading all the records in the Excel file in one pass. Basically, how can I read all the rows repeatedly by looping? I want to chunk the data load, but I still need to read all of the data.

koryknick
Employee

Personally, I would simply read from the Excel file and load into the database without trying to “chunk” the file. Your database account should have a “Batch size” configuration that provides record checkpoint commits. If there is a concern that the pipeline may error or time out before completing, you can use a pipeline parameter to set the “Start row” property on the parser so you can recover from the failure point without reprocessing records that have already been loaded into the table.
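A rough sketch of that recovery setup (the parameter name start_row is just a placeholder, and this assumes the parser’s “Start row” property accepts an expression, with pipeline parameters referenced by a leading underscore):

    Pipeline parameter:  start_row, default value 1
    Excel Parser:        “Start row” (expression enabled) = parseInt(_start_row)
                         “End row” left blank so the parser reads to the end of the file

If a run fails after committing, say, 200,000 rows, you re-run the pipeline with start_row = 200001 and it resumes from that point without reloading rows that are already in the table.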

Looping is a procedural concept… SnapLogic works in data streams. However, if you feel strongly about chunking the data load, you can use the Pipeline Execute Snap to call a child pipeline that processes a sub-portion of the input file. Keep in mind that each child sub-process will re-read the file from the beginning to get to the particular record number it starts from.
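If you do go the Pipeline Execute route, here is a rough sketch (chunk_start and the chunk size of 50,000 are placeholders you would choose yourself):

    Parent pipeline:  a Snap such as Sequence or a Mapper emits one document per chunk,
                      e.g. { "chunk_start": 1 }, { "chunk_start": 50001 }, { "chunk_start": 100001 }, …
                      Pipeline Execute maps $chunk_start to a child pipeline parameter named chunk_start
    Child pipeline:   File Reader -> Excel Parser with
                      “Start row” = parseInt(_chunk_start)
                      “End row”   = parseInt(_chunk_start) + 50000 - 1
                      followed by your database load Snap

As noted above, each child invocation still has to read the file from the top to reach its starting row, so the total read time grows with the number of chunks.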