Posted to common-user@hadoop.apache.org by Keith Fisher <kp...@gmail.com> on 2008/07/23 15:50:21 UTC

DFSClient java.io.IOException: Too many open files

I'm running Hadoop version 0.17.0 on a Red Hat Enterprise Linux 4.4
box, using an IBM-provided JDK 1.5. I've configured Hadoop to run on
localhost.

I've written a simple test that opens and writes files in HDFS,
closing each output stream after writing 10 bytes. After 471 files,
I see an exception from DFSClient in the log4j logs for my test:

Exception in createBlockOutputStream java.io.IOException: Too many open files
Abandoning block blk_ .....
DataStreamer Exception: java.net.SocketException: Too many open files
Error recovery for block_ .... bad datanode[0]
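
For reference, the test loop is essentially the following (class name,
paths, and loop count here are illustrative, not my exact code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ManyFilesTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);      // one shared FileSystem instance
        byte[] data = new byte[10];                // 10 bytes per file

        for (int i = 0; i < 1000; i++) {
            Path p = new Path("/test/file-" + i);
            FSDataOutputStream out = fs.create(p); // open a new file
            try {
                out.write(data);                   // write 10 bytes
            } finally {
                out.close();                       // stream is closed every time
            }
        }
        fs.close();
    }
}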

I'd appreciate any suggestions on how to resolve this problem.

Thanks.

Re: DFSClient java.io.IOException: Too many open files

Posted by Jason Venner <ja...@attributor.com>.
In /etc/security/limits.conf, change the per-user or the system default
number of open files.

Adding this line

*       hard    nofile  65536

will allow any user to open up to 65536 files.
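
Note that the hard limit is only the ceiling; processes start with the
soft limit, so if that is still low you may want to raise it as well,
for example (65536 is just an example value):

*       soft    nofile  65536
*       hard    nofile  65536

limits.conf is applied at login (via pam_limits), so log out, log back
in, and restart the Hadoop daemons from the new session before
re-running the test.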


-- 
Jason Venner
Attributor - Program the Web <http://www.attributor.com/>
Attributor is hiring Hadoop Wranglers and coding wizards, contact if 
interested

Re: DFSClient java.io.IOException: Too many open files

Posted by Raghu Angadi <ra...@yahoo-inc.com>.
Check the limit on the number of open files in your environment (the ulimit
command in the shell). You should increase the limit to a larger value, like 8k.
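
For example (1024 is just a typical default; you can only raise the soft
limit up to the hard limit as a non-root user):

$ ulimit -n          # current soft limit for this shell
1024
$ ulimit -n 8192     # raise it for this shell and anything started from it
$ bin/start-all.sh   # restart the daemons so they inherit the new limit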

Raghu.
