
Cannot read more than 2,147,483,647 into a byte array

rnarayan
New Contributor

Hello,

I am facing an issue while reading a huge file (more than 2 GB in size). I want to read the file and write it out as a zip file. I tried two approaches, and in both I am getting an error.

Method 1: After the File Reader snap, I use a Binary to Document snap (Encode or Decode set to NONE; I also tried BYTE_TO_ARRAY), followed by a Mapper and then a Document to Binary snap (Encode or Decode set to NONE; also tried BYTE_TO_ARRAY).

In the Mapper, the following expressions are used to map the data:

$['content-location'].substr($['content-location'].lastIndexOf("/") + 1) → $['content-location']
$content → $content
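For reference, that expression just drops the directory path and keeps the file name after the last "/" in content-location. In plain Java terms it does roughly this (the path below is only a made-up example):

```java
public class FileNameFromLocation {
    public static void main(String[] args) {
        // Hypothetical example value for $['content-location']
        String contentLocation = "/mnt/data/input/huge_file.dat";
        // Same idea as .substr(lastIndexOf("/") + 1): keep only the file name
        String fileName = contentLocation.substring(contentLocation.lastIndexOf('/') + 1);
        System.out.println(fileName); // prints: huge_file.dat
    }
}
```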

In this case I am getting the error "Cannot read more than 2,147,483,647 into a byte array".
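As far as I understand, 2,147,483,647 is Integer.MAX_VALUE, the largest possible length of a Java array, so any step that buffers the whole binary stream into a single byte array has to fail once the file is larger than 2 GB, no matter how much heap the node has. A minimal sketch of that limit (the file name is just a placeholder):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ByteArrayLimitDemo {
    public static void main(String[] args) throws IOException {
        Path file = Path.of("huge_file.dat"); // hypothetical >2 GB input file
        long size = Files.size(file);
        // A Java byte[] can hold at most Integer.MAX_VALUE (2,147,483,647) bytes,
        // so reading the whole file into one array is impossible beyond that size.
        if (size > Integer.MAX_VALUE) {
            throw new IOException("Cannot read more than " + Integer.MAX_VALUE + " into a byte array");
        }
        byte[] wholeFile = Files.readAllBytes(file); // only safe when the size fits in an int
        System.out.println("Read " + wholeFile.length + " bytes");
    }
}
```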


Method 2:

After the File Reader snap, I used a Mapper with binary input and output views and the following expressions to map the data:

$['content-location'].substr($['content-location'].lastIndexOf("/") + 1) → $['content-location']
$content → $content


After a few seconds of execution, the pipeline hangs and I get the error "Lost contact with Snaplex node while the pipeline was running".

Is there something which I am missing?

1 REPLY

rnarayan
New Contributor

I have found a solution for this: before converting the binary data to a document, I added a Compress snap to compress the data (GZIP), and then a Decompress snap (GZIP) before writing it out with the zip file writer.

With this approach there are no errors and the data is written to the file.
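My understanding is that this works because the GZIP-compressed payload is small enough to fit under the 2 GB byte-array limit when the Binary to Document conversion happens, while the Compress and Decompress steps stream the data. Here is a rough stand-alone sketch of that compress-then-decompress round trip in Java; the file names are made up, and this is only an illustration, not what the snaps do internally:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    public static void main(String[] args) throws IOException {
        Path source = Path.of("huge_file.dat");     // hypothetical >2 GB input
        Path gzipped = Path.of("huge_file.dat.gz"); // intermediate compressed copy
        Path restored = Path.of("huge_file.out");   // decompressed output

        // Compress: stream the source through GZIP without holding it all in memory.
        try (InputStream in = Files.newInputStream(source);
             OutputStream out = new GZIPOutputStream(Files.newOutputStream(gzipped))) {
            in.transferTo(out);
        }

        // Decompress: stream the compressed copy back out to a file.
        try (InputStream in = new GZIPInputStream(Files.newInputStream(gzipped));
             OutputStream out = Files.newOutputStream(restored)) {
            in.transferTo(out);
        }
    }
}
```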


Hope this helps!!

If anyone has another solution, please let me know…