Posted to mapreduce-user@hadoop.apache.org by unmesha sreeveni <un...@gmail.com> on 2014/01/07 09:49:05 UTC

FAILED EMFILE: Too many open files

While I am trying to run an MR job, I am getting
"FAILED EMFILE: Too many open files"
    at org.apache.hadoop.io.nativeio.NativeIO.open(Native Method)
    at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:172)
    at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:310)
    at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:383)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)

Why is this happening?

-- 
Thanks & Regards

Unmesha Sreeveni U.B
Junior Developer

http://www.unmeshasreeveni.blogspot.in/

Re: FAILED EMFILE: Too many open files

Posted by Akira AJISAKA <aj...@oss.nttdata.co.jp>.
The number of files a single user or process can have open is limited.
You can increase the limit by editing /etc/security/limits.conf or by
using the ulimit command.
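
For example, here is a minimal sketch of checking and raising the
limit on a Linux node (the user name "hadoop" and the value 16384
below are placeholders; pick values appropriate for your cluster):

  # Show the current per-process open-file limit for this shell
  ulimit -n

  # Raise the soft limit for the current shell session only
  # (takes effect immediately, but is lost on logout)
  ulimit -n 16384

  # To make the change permanent, add lines like these to
  # /etc/security/limits.conf, then log in again (or restart
  # the Hadoop daemons) so they take effect:
  #   hadoop  soft  nofile  16384
  #   hadoop  hard  nofile  16384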

For more details, see this wiki page.
http://wiki.apache.org/hadoop/TooManyOpenFiles
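
If you want to confirm that a task is actually running out of
descriptors before raising the limit, you can count the files a
running task JVM holds open (the pid 12345 below is a placeholder;
substitute the pid of the failing child process):

  # Count the open file descriptors of one process
  lsof -p 12345 | wc -l

  # Compare against that process's limit
  grep 'open files' /proc/12345/limits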

(2014/01/07 17:49), unmesha sreeveni wrote:
>
> While I am trying to run an MR job, I am getting
> " FAILED EMFILE: Too many open files "
> at org.apache.hadoop.io.nativeio.NativeIO.open(Native Method)
> at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:172)
> at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:310)
> at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:383)
> at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
> at org.apache.hadoop.mapred.Child.main(Child.java:262)
>
> Why is this happening?
>
> --
> Thanks & Regards
>
> Unmesha Sreeveni U.B
> Junior Developer
>
> http://www.unmeshasreeveni.blogspot.in/

