Posted to user@hive.apache.org by Abhishek Gayakwad <a....@gmail.com> on 2013/02/25 13:34:28 UTC

FileAlreadyExistsException Parent path is not a directory

I am using Hive 0.9.0. While creating an external table with

create external table if not exists table1
(
Id int,
Name string,
PName string,
timestamp string
)
row format delimited fields terminated by '\t'
LOCATION '/a/b/c/d/log';


I am getting this error:

FAILED: Error in metadata: MetaException(message:Got exception:
org.apache.hadoop.fs.FileAlreadyExistsException Parent path is not a
directory: /a/b/c/d/log log
        at
org.apache.hadoop.hdfs.server.namenode.FSDirectory.mkdirs(FSDirectory.java:1485)
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2891)
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInt(FSNamesystem.java:2844)
        at
org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2823)
        at
org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:639)
        at
org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:417)
        at
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:44096)
        at
org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:453)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:898)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1693)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1689)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1687)
)
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask


Does anybody know why this is happening?

Thanks

Re: FileAlreadyExistsException Parent path is not a directory

Posted by Nitin Pawar <ni...@gmail.com>.
Do a $HADOOP_HOME/bin/hadoop dfs -ls /a/b/c/d/log

You should see it as a file.

Also, when you give a LOCATION in a create table statement, make it a directory.
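
For example, if the -ls output shows /a/b/c/d/log is a file, one way is to move that file into a directory and point the table at the directory instead. A rough sketch only (the log_dir name is just an example, not something from your setup):

# check whether the path is a file or a directory
$HADOOP_HOME/bin/hadoop dfs -ls /a/b/c/d

# make a directory and move the existing file into it
$HADOOP_HOME/bin/hadoop dfs -mkdir /a/b/c/d/log_dir
$HADOOP_HOME/bin/hadoop dfs -mv /a/b/c/d/log /a/b/c/d/log_dir/

# then use LOCATION '/a/b/c/d/log_dir' in the create table statement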


-- 
Nitin Pawar

RE: FileAlreadyExistsException Parent path is not a directory

Posted by Arthur Boender <ar...@avanade.com>.
Is log a file or a directory?

Creating an external table over a directory might be easier, as it would apply to any log file placed into the folder.

If log is the file, try

LOCATION '/a/b/c/d/';

If log is a directory, try

LOCATION '/a/b/c/d/log/';
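
For instance, if log turns out to be a single file sitting directly under /a/b/c/d/, the statement could look like the sketch below (same columns as your original, only the LOCATION changes; untested, just to illustrate):

create external table if not exists table1
(
Id int,
Name string,
PName string,
timestamp string
)
row format delimited fields terminated by '\t'
LOCATION '/a/b/c/d/';

Keep in mind that Hive would then read every file directly under /a/b/c/d/, not just log.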

Regards, Arthur
