Posted to user@hive.apache.org by Dan Richelson <dr...@tendrilinc.com> on 2012/08/10 21:33:41 UTC

HDFS AlreadyBeingCreatedException while running hive script

Hi,
I am encountering an intermittent failure on my cluster related to an
HDFS leaseholder issue. The script below (and the Oozie workflow that
triggers it) runs fine most of the time.

Running CDH4, but with Hive 0.9.0 instead of the included 0.8.x.
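One way to check whether a stale lease is still held on the scratch file when this happens (a sketch, not verified; the path is copied from the trace below, and this assumes the stock HDFS client tools are on the node):

```shell
# List files under the Hive scratch dir that HDFS still considers open
# for write. If .../emptyFile shows up here after the job dies, a
# lingering lease would explain the AlreadyBeingCreatedException on the
# next attempt to create the same path.
hadoop fsck /tmp/hive-mapred -files -openforwrite
```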

Any help is appreciated,
Dan

org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException: failed
to create file /tmp/hive-mapred/hive_2012-08-10_05-45-07_087_8229327593305391688/-mr-10006/1/emptyFile
for DFSClient_NONMAPREDUCE_-678411175_1 on client 192.168.219.142
because current leaseholder is trying to recreate file.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:1752)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1589)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1514)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:408)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)

	at org.apache.hadoop.ipc.Client.call(Client.java:1161)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:184)
	at $Proxy10.create(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:165)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:84)
	at $Proxy10.create(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:187)
	at org.apache.hadoop.hdfs.DFSOutputStream.<init>(DFSOutputStream.java:1250)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1269)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1063)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1021)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:232)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:75)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:806)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:686)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:675)
	at org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat.getHiveRecordWriter(HiveIgnoreKeyTextOutputFormat.java:80)
	at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPath(ExecDriver.java:842)
	at org.apache.hadoop.hive.ql.exec.ExecDriver.addInputPaths(ExecDriver.java:886)
	at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:409)
	at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:137)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
Job Submission failed with exception
'org.apache.hadoop.ipc.RemoteException(failed to create file
/tmp/hive-mapred/hive_2012-08-10_05-45-07_087_8229327593305391688/-mr-10006/1/emptyFile
for DFSClient_NONMAPREDUCE_-678411175_1 on client 192.168.219.142
because current leaseholder is trying to recreate file.
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.recoverLeaseInternal(FSNamesystem.java:1752)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:1589)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:1514)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:408)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:200)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:42590)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:427)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:916)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1692)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1688)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1686)
)'
java.lang.IllegalArgumentException: Can not create a Path from an empty string
	at org.apache.hadoop.fs.Path.checkPathArg(Path.java:91)
	at org.apache.hadoop.fs.Path.<init>(Path.java:99)
	at org.apache.hadoop.hive.ql.exec.Utilities.getHiveJobID(Utilities.java:380)
	at org.apache.hadoop.hive.ql.exec.Utilities.clearMapRedWork(Utilities.java:193)
	at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:459)
	at org.apache.hadoop.hive.ql.exec.MapRedTask.execute(MapRedTask.java:137)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:134)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:47)
Starting Job = job_201208030244_27387, Tracking URL =
http://hadoop-dev-node01.tendrilinc.com:50030/jobdetails.jsp?jobid=job_201208030244_27387
Kill Command = /usr/lib/hadoop-0.20-mapreduce/bin/hadoop job
-Dmapred.job.tracker=hadoop-dev-node01.tendrilinc.com:8021 -kill
job_201208030244_27387
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.MapRedTask
FAILED: Hive Internal Error: java.lang.SecurityException(Intercepted
System.exit(9))
java.lang.SecurityException: Intercepted System.exit(9)
	at org.apache.oozie.action.hadoop.LauncherSecurityManager.checkExit(LauncherMapper.java:747)
	at java.lang.Runtime.exit(Runtime.java:88)
	at java.lang.System.exit(System.java:904)
	at org.apache.hadoop.hive.ql.Driver.taskCleanup(Driver.java:1340)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1170)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:258)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:215)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:406)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:341)
	at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:439)
	at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:449)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:647)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:557)
	at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:303)
	at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:280)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:37)
	at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:55)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:454)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:393)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:327)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:270)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
	at org.apache.hadoop.mapred.Child.main(Child.java:264)

Intercepting System.exit(12)
Failing Oozie Launcher, Main class
[org.apache.oozie.action.hadoop.HiveMain], exit code [12]

-- 
Dan Richelson, Software Engineer

Tendril
2560 55th St. | Boulder, Colorado 80301
M 303-709-2214
www.tendrilinc.com
 