Posted to mapreduce-user@hadoop.apache.org by Markus Jelsma <ma...@openindex.io> on 2011/12/26 18:38:24 UTC

AlreadyExistsException for log file on 0.20.205.0

Hi,

We're sometimes seeing this exception when a map task is retried after an earlier
failure, for example an OOM error. Any ideas on how to address this issue?

org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException: File /opt/hadoop/hadoop-0.20.205.0/libexec/../logs/userlogs/job_201112261420_0003/attempt_201112261420_0003_m_000029_1/log.tmp already exists
	at org.apache.hadoop.io.SecureIOUtils.insecureCreateForWrite(SecureIOUtils.java:130)
	at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:157)
	at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:296)
	at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:369)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:257)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)


Thanks
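
For context on where this comes from: TaskLog.syncLogs writes the attempt's log index
via SecureIOUtils.createForWrite, which refuses to reuse a path that is already present,
so a leftover log.tmp from the earlier attempt trips it. A rough sketch of that check,
inferred from the exception message and stack trace rather than the exact 0.20.205 source:

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

// Illustration only, not the real Hadoop class: an exclusive create that
// fails when the target (here the attempt's log.tmp) already exists.
class ExclusiveCreateSketch {
    static FileOutputStream createForWrite(File f) throws IOException {
        if (f.exists()) {
            // Corresponds to SecureIOUtils$AlreadyExistsException in the trace above.
            throw new IOException("File " + f + " already exists");
        }
        return new FileOutputStream(f);
    }
}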

Re: AlreadyExistsException for log file on 0.20.205.0

Posted by Markus Jelsma <ma...@openindex.io>.
No. Although I didn't mention it in the other thread, we already had it disabled on
0.20.205 because we suspected a leak somewhere.
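
For reference, JVM reuse on 0.20.x is controlled by mapred.job.reuse.jvm.num.tasks
(1 = no reuse, -1 = unlimited), or JobConf.setNumTasksToExecutePerJvm() in the old
mapred API. A minimal sketch of disabling it for a job; the helper class here is just
a placeholder, not anything from the thread:

import org.apache.hadoop.mapred.JobConf;

// Placeholder helper showing only the JVM-reuse setting.
class JvmReuseExample {
    static JobConf withJvmReuseDisabled() {
        JobConf conf = new JobConf();
        // 1 = each task runs in a fresh child JVM (reuse off); -1 = unlimited reuse.
        conf.setNumTasksToExecutePerJvm(1);
        // Equivalent raw property:
        // conf.setInt("mapred.job.reuse.jvm.num.tasks", 1);
        return conf;
    }
}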

> Is this with jvm reuse turned on?
> 
> On Dec 26, 2011, at 9:38 AM, Markus Jelsma wrote:
> > Hi,
> > 
> > We're sometimes seeing this exception when a map task is retried after an
> > earlier failure, for example an OOM error. Any ideas on how to address this
> > issue?
> > 
> > org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException: File /opt/hadoop/hadoop-0.20.205.0/libexec/../logs/userlogs/job_201112261420_0003/attempt_201112261420_0003_m_000029_1/log.tmp already exists
> > 	at org.apache.hadoop.io.SecureIOUtils.insecureCreateForWrite(SecureIOUtils.java:130)
> > 	at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:157)
> > 	at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:296)
> > 	at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:369)
> > 	at org.apache.hadoop.mapred.Child$4.run(Child.java:257)
> > 	at java.security.AccessController.doPrivileged(Native Method)
> > 	at javax.security.auth.Subject.doAs(Subject.java:396)
> > 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
> > 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
> > 
> > Thanks

Re: AlreadyExistsException for log file on 0.20.205.0

Posted by Arun C Murthy <ac...@hortonworks.com>.
Is this with jvm reuse turned on?

On Dec 26, 2011, at 9:38 AM, Markus Jelsma wrote:

> Hi,
> 
> We're sometimes seeing this exception when a map task is retried after an
> earlier failure, for example an OOM error. Any ideas on how to address this issue?
> 
> org.apache.hadoop.io.SecureIOUtils$AlreadyExistsException: File /opt/hadoop/hadoop-0.20.205.0/libexec/../logs/userlogs/job_201112261420_0003/attempt_201112261420_0003_m_000029_1/log.tmp already exists
> 	at org.apache.hadoop.io.SecureIOUtils.insecureCreateForWrite(SecureIOUtils.java:130)
> 	at org.apache.hadoop.io.SecureIOUtils.createForWrite(SecureIOUtils.java:157)
> 	at org.apache.hadoop.mapred.TaskLog.writeToIndexFile(TaskLog.java:296)
> 	at org.apache.hadoop.mapred.TaskLog.syncLogs(TaskLog.java:369)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:257)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1059)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:249)
> 
> 
> Thanks