Posted to common-user@hadoop.apache.org by pvvpr <pv...@research.iiit.ac.in> on 2008/09/20 08:01:54 UTC

AlreadyBeingCreatedException in reduce

I tried to run a Nutch fetch job and got this exception during the reduce phase using 0.18.1-dev. The same job does not throw the exception on 0.16.4. Any pointers on where I am going wrong?

org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed to create file /user/jobs/crawl/segments/20080917193538/crawl_fetch/part-00001/index for DFSClient_attempt_200809161607_0012_r_000001_1 on client 67.215.230.24 because current leaseholder is trying to recreate file.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1047)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:990)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.create(NameNode.java:298)
	at sun.reflect.GeneratedMethodAccessor8.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:452)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:888)

	at org.apache.hadoop.ipc.Client.call(Client.java:707)
	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
	at $Proxy1.create(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
	at $Proxy1.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:2432)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:453)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:170)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:485)
	at org.apache.hadoop.io.SequenceFile$BlockCompressWriter.<init>(SequenceFile.java:1198)
	at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:401)
	at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:306)
	at org.apache.hadoop.io.MapFile$Writer.<init>(MapFile.java:160)
	at org.apache.hadoop.io.MapFile$Writer.<init>(MapFile.java:134)
	at org.apache.hadoop.io.MapFile$Writer.<init>(MapFile.java:92)
	at org.apache.nutch.fetcher.FetcherOutputFormat.getRecordWriter(FetcherOutputFormat.java:66)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:371)
	at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2359)
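
In case it helps: the "current leaseholder is trying to recreate file" message suggests a second reduce attempt (a retry or a speculatively executed duplicate, note the `_1` suffix on the attempt id) is trying to create the same output file while the first attempt still holds the HDFS lease. One workaround I have seen suggested is to disable speculative execution. The property name below is what I believe applies to this Hadoop line, but that is an assumption; please verify it against your hadoop-default.xml before relying on it.

```xml
<!-- hadoop-site.xml sketch (assumption: property name as shipped in
     hadoop-default.xml for this release; verify before use) -->
<property>
  <name>mapred.speculative.execution</name>
  <value>false</value>
  <description>Run only one attempt of each task at a time, so duplicate
  reduce attempts do not race to create crawl_fetch output files.</description>
</property>
```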

thanks and regards,
Prasad Pingali.