Posted to general@hadoop.apache.org by zjl208399617 <zj...@163.com> on 2012/11/02 09:59:59 UTC

File too large Error when MR

When I run a Hive query, the reduce tasks often throw this error:

Error: java.io.IOException: File too large
	at java.io.FileOutputStream.writeBytes(Native Method)
	at java.io.FileOutputStream.write(FileOutputStream.java:260)
	at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.write(RawLocalFileSystem.java:190)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
	at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:49)
	at java.io.DataOutputStream.write(DataOutputStream.java:90)
	at org.apache.hadoop.mapred.IFileOutputStream.write(IFileOutputStream.java:84)
	at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.write(FSDataOutputStream.java:49)
	at java.io.DataOutputStream.write(DataOutputStream.java:90)
	at org.apache.hadoop.mapred.IFile$Writer.append(IFile.java:217)
	at org.apache.hadoop.mapred.Merger.writeFile(Merger.java:157)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$InMemFSMergeThread.doInMemMerge(ReduceTask.java:2560)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$InMemFSMergeThread.run(ReduceTask.java:2501)

What should I do?




zjl208399617

Re: File too large Error when MR

Posted by Andy Isaacson <ad...@cloudera.com>.
Moving the thread to user@. The general@ list is not used for
technical questions.

On Fri, Nov 2, 2012 at 1:59 AM, zjl208399617 <zj...@163.com> wrote:
> When I run a Hive query, the reduce tasks often throw this error:
>
> Error: java.io.IOException: File too large
>         at java.io.FileOutputStream.writeBytes(Native Method)

This error is EFBIG from your filesystem. Either the file is too large
for your filesystem, or your job is trying to write to an invalid
large offset in the file.
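
For a concrete picture of the second case, here is a minimal, self-contained
Java sketch (the path and offset are arbitrary assumptions, not anything Hadoop
itself does) that triggers the same exception by writing at an offset beyond a
filesystem's per-file limit:

    import java.io.IOException;
    import java.io.RandomAccessFile;

    public class EfbigDemo {
        public static void main(String[] args) throws IOException {
            // Any path on the local disk in question; /tmp/efbig-demo is just
            // an example.
            try (RandomAccessFile f = new RandomAccessFile("/tmp/efbig-demo", "rw")) {
                // Seek 3 TB into the file. The file stays sparse, so this does
                // not actually consume 3 TB of disk space.
                f.seek(3L * 1024 * 1024 * 1024 * 1024);
                // On a filesystem whose maximum file size is below this offset
                // (for example ext3's 2 TB limit), the kernel rejects the write
                // with EFBIG and the JVM surfaces it as
                // "java.io.IOException: File too large".
                f.write(0);
            }
        }
    }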

What filesystem are you using? If it is ext3 the maximum file size is 2 TB;
ext4 allows up to 16 TB. If you are using FAT/msdos the maximum file size is 4 GB.

-andy
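
To answer the filesystem question directly on the nodes where the reduce tasks
run, one option is a short Java 7+ sketch like the one below. It uses the
standard java.nio.file.FileStore API; the directory shown is only a placeholder,
in practice pass the directories configured in mapred.local.dir, since that is
where the in-memory merge spills its output.

    import java.io.IOException;
    import java.nio.file.FileStore;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class LocalDirFsType {
        public static void main(String[] args) throws IOException {
            // Placeholder default; pass the real mapred.local.dir entries as
            // command-line arguments.
            String[] dirs = args.length > 0 ? args : new String[] {"/data/1/mapred/local"};
            for (String dir : dirs) {
                Path p = Paths.get(dir);
                FileStore store = Files.getFileStore(p);
                // type() reports the backing filesystem, e.g. "ext3", "ext4",
                // "xfs" or "vfat".
                System.out.printf("%s -> %s, %d GB free%n",
                        dir, store.type(), store.getUsableSpace() / (1024L * 1024 * 1024));
            }
        }
    }

For example, "java LocalDirFsType /data/1/mapred/local /data/2/mapred/local"
prints one line per directory; anything other than ext3/ext4/xfs (in particular
vfat) is a likely source of the 4 GB limit mentioned above.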
