Posted to common-user@hadoop.apache.org by "yingnan.ma" <yi...@ipinyou.com> on 2012/05/28 07:01:04 UTC

No space left on device

Hi,

I encountered the following problem:

 Error - Job initialization failed:
org.apache.hadoop.fs.FSError: java.io.IOException: No space left on device

 at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.write(RawLocalFileSystem.java:201)
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
        at java.io.FilterOutputStream.close(FilterOutputStream.java:140)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:61)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:86)
        at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.close(ChecksumFileSystem.java:348)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:61)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:86)
        at org.apache.hadoop.mapred.JobHistory$JobInfo.logSubmitted(JobHistory.java:1344)
        ......

So I think HDFS may be full, or something similar, but I cannot find a way to confirm or address the problem. If you have any suggestions, please share them. Thank you.
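
The stack trace above comes from RawLocalFileSystem, i.e. a write to the local filesystem of the node that is initializing the job, so the local disks of the JobTracker machine are worth checking as well as HDFS. A minimal Java sketch along those lines is below; the three paths are only guesses at common 1.x-era locations (hadoop.tmp.dir, the JobTracker log directory, mapred.local.dir) and should be replaced with the values from your own configuration.

import java.io.File;

public class DiskSpaceCheck {
    public static void main(String[] args) {
        // Candidate directories -- guesses at common defaults, not values taken
        // from this cluster; substitute your own hadoop.tmp.dir,
        // mapred.local.dir and JobTracker log directory.
        String[] candidates = {
            "/tmp/hadoop",
            "/var/log/hadoop",
            "/var/lib/hadoop/mapred/local"
        };
        for (String path : candidates) {
            File dir = new File(path);
            if (!dir.exists()) {
                System.out.println(path + " : does not exist on this node");
                continue;
            }
            long freeMb  = dir.getUsableSpace() / (1024L * 1024L);
            long totalMb = dir.getTotalSpace() / (1024L * 1024L);
            System.out.println(path + " : " + freeMb + " MB free of " + totalMb + " MB");
        }
    }
}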

Best Regards


 

Re: No space left on device

Posted by Marcos Ortiz <ml...@uci.cu>.
Do you have the JobTracker (JT) and NameNode (NN) on the same node?
Take a look at Lars Francke's post:
http://gbif.blogspot.com/2011/01/setting-up-hadoop-cluster-part-1-manual.html
It is a very clear scheme for installing Hadoop; look at the configuration
he used for the name and data directories.
If these directories are on the same disk and you don't have enough
space for them, you can hit that exception.

My recommendation is to split these directories across separate disks, with a
layout very similar to Lars's configuration.
Another recommendation is to check Hadoop's logs. Read about that here:
http://www.cloudera.com/blog/2010/11/hadoop-log-location-and-retention/
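
As a rough sketch of that check (assuming Hadoop 1.x-era key names such as hadoop.tmp.dir, dfs.name.dir, dfs.data.dir and mapred.local.dir, and that the *-site.xml files are on the classpath), the following prints where those directories are configured to live and how much space is left on each, so you can see whether they share one nearly full disk.

import java.io.File;
import org.apache.hadoop.conf.Configuration;

public class HadoopDirReport {
    public static void main(String[] args) {
        // new Configuration() loads core-site.xml from the classpath;
        // the HDFS and MapReduce site files are added explicitly.
        Configuration conf = new Configuration();
        conf.addResource("hdfs-site.xml");
        conf.addResource("mapred-site.xml");
        String[] keys = {
            "hadoop.tmp.dir", "dfs.name.dir", "dfs.data.dir", "mapred.local.dir"
        };
        for (String key : keys) {
            String value = conf.get(key, "(not set)");
            System.out.println(key + " = " + value);
            // A single key may hold a comma-separated list of directories.
            for (String path : value.split(",")) {
                File dir = new File(path.trim());
                if (dir.exists()) {
                    System.out.println("  " + dir + " : "
                            + dir.getUsableSpace() / (1024L * 1024L) + " MB free");
                }
            }
        }
    }
}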

regards

On 05/28/2012 02:20 AM, yingnan.ma wrote:
> OK, I found it: the JobTracker server's disk is full.
>
>
> 2012-05-28
>
>
>
> yingnan.ma
>
>
>
> From: yingnan.ma
> Sent: 2012-05-28 13:01:56
> To: common-user
> Cc:
> Subject: No space left on device
>
> Hi,
> I encountered the following problem:
>   Error - Job initialization failed:
> org.apache.hadoop.fs.FSError: java.io.IOException: No space left on device
>   at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.write(RawLocalFileSystem.java:201)
>          at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
>          at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
>          at java.io.FilterOutputStream.close(FilterOutputStream.java:140)
>          at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:61)
>          at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:86)
>          at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.close(ChecksumFileSystem.java:348)
>          at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:61)
>          at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:86)
>          at org.apache.hadoop.mapred.JobHistory$JobInfo.logSubmitted(JobHistory.java:1344)
>          ......
> So I think HDFS may be full, or something similar, but I cannot find a way to confirm or address the problem. If you have any suggestions, please share them. Thank you.
> Best Regards

-- 
Marcos Luis Ortíz Valmaseda
  Data Engineer && Sr. System Administrator at UCI
  http://marcosluis2186.posterous.com
  http://www.linkedin.com/in/marcosluis2186
  Twitter: @marcosluis2186



Re: No space left on device

Posted by "yingnan.ma" <yi...@ipinyou.com>.
OK, I found it: the JobTracker server's disk is full.
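
To see what is actually filling that disk, du -sh * on the suspect partition works; a rough Java equivalent is sketched below. The default starting directory is only an assumed example; the job history files that the stack trace shows being written are one place to look.

import java.io.File;

public class BigDirs {
    public static void main(String[] args) {
        // "/var/log/hadoop" is only an assumed starting point; pass the mount
        // point that is reported as full on the JobTracker instead.
        File root = new File(args.length > 0 ? args[0] : "/var/log/hadoop");
        File[] children = root.listFiles();
        if (children == null) {
            System.out.println("Cannot read " + root);
            return;
        }
        for (File child : children) {
            System.out.println(sizeOf(child) / (1024L * 1024L) + " MB\t" + child);
        }
    }

    // Recursively total the size of all regular files under f.
    static long sizeOf(File f) {
        if (f.isFile()) {
            return f.length();
        }
        long total = 0L;
        File[] entries = f.listFiles();
        if (entries != null) {
            for (File entry : entries) {
                total += sizeOf(entry);
            }
        }
        return total;
    }
}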


2012-05-28 



yingnan.ma 



From: yingnan.ma
Sent: 2012-05-28 13:01:56
To: common-user
Cc:
Subject: No space left on device
 
Hi,
I encountered the following problem:
 Error - Job initialization failed:
org.apache.hadoop.fs.FSError: java.io.IOException: No space left on device
 at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.write(RawLocalFileSystem.java:201)
        at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
        at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
        at java.io.FilterOutputStream.close(FilterOutputStream.java:140)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:61)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:86)
        at org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSOutputSummer.close(ChecksumFileSystem.java:348)
        at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:61)
        at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:86)
        at org.apache.hadoop.mapred.JobHistory$JobInfo.logSubmitted(JobHistory.java:1344)
        ......
So I think HDFS may be full, or something similar, but I cannot find a way to confirm or address the problem. If you have any suggestions, please share them. Thank you.
Best Regards