
Cannot read more than 2,147,483,647 into a byte array

rnarayan
New Contributor

Hello,

I am facing an issue while reading a huge file (more than 2 GB in size). I want to read the file and zip it. I tried two approaches, and in both I am getting an error.

Method 1: After the File Reader snap, I use a Binary to Document snap (Encode or Decode set to NONE; also tried BYTE_TO_ARRAY), followed by a Mapper and then a Document to Binary snap (Encode or Decode set to NONE; also tried BYTE_TO_ARRAY).

In the Mapper, the following expressions are used to map the data (the first keeps only the file name after the last "/" in content-location; for example, a hypothetical "/tmp/archive/huge.bin" becomes "huge.bin"):

$['content-location'].substr($['content-location'].lastIndexOf("/") + 1) → $['content-location']
$content → $content

In this case I am getting the error "Cannot read more than 2,147,483,647 into a byte array".
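For reference, 2,147,483,647 is exactly Integer.MAX_VALUE on the JVM (the Snaplex is Java-based): Java arrays are indexed with int, so a single byte array can never hold more than that many bytes, regardless of heap size. A minimal plain-Java illustration of the limit (not SnapLogic code):

public class ByteArrayLimit {
    public static void main(String[] args) {
        // The number in the error message is exactly Integer.MAX_VALUE.
        System.out.println(Integer.MAX_VALUE); // prints 2147483647
        // Array lengths are ints, so new byte[n] can never exceed this,
        // which is why buffering a >2 GB file into one byte array fails.
    }
}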


Method 2:

After the File Reader snap, I used binary input and output views on the Mapper, with the following expressions to map the data:

$['content-location'].substr($['content-location'].lastIndexOf("/") + 1) → $['content-location']
$content → $content


After a few seconds of execution, the pipeline hangs and I get the error "Lost contact with Snaplex node while the pipeline was running".

Is there something which I am missing?

1 REPLY

rnarayan
New Contributor

I have found a solution for this: before converting the binary data to a document, I used a Compress snap to compress the data (GZIP), and then a Decompress snap (GZIP) before writing it out to the zip file writer.

This runs without any issue, and the data is written to the file.
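My guess as to why this works: the Compress and Decompress snaps stream the data through in chunks instead of materializing the entire file as a single byte array, so the 2 GB array limit is never hit. A rough plain-Java sketch of that streaming idea (file names are hypothetical; this is an illustration, not what the snaps literally do internally):

import java.io.*;
import java.util.zip.GZIPOutputStream;

public class StreamGzip {
    public static void main(String[] args) throws IOException {
        // Hypothetical file names, for illustration only.
        try (InputStream in = new FileInputStream("huge-input.bin");
             OutputStream out = new GZIPOutputStream(new FileOutputStream("huge-input.bin.gz"))) {
            byte[] buffer = new byte[8192]; // small, fixed-size chunk
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n); // the whole file never sits in one array
            }
        }
    }
}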


Hope this helps!

If anyone has another solution, please let me know…