Posted to user@hive.apache.org by Ranjitha Chandrashekar <Ra...@hcl.com> on 2013/04/04 07:43:56 UTC

External Table to Sequence File on HDFS

Hi



I want to create an external Hive table over a sequence file (each record is a key-value pair) on HDFS. How will the field names be mapped to the column names?



Please Suggest.



Thanks

Ranjitha.




RE: External Table to Sequence File on HDFS

Posted by Ranjitha Chandrashekar <Ra...@hcl.com>.
Hi Sanjay

Thank you for the quick response.

I got the input format part from the link that you sent. But in order to read that table in Hive, I need to specify the SerDe; where exactly do I specify this class file?

Is it something like,

create table seq10 (key STRING, value STRING)
  ROW FORMAT SERDE 'com.org.SequenceFileKeyRecordReader'
  STORED AS INPUTFORMAT 'com.org.SequenceFileKeyInputFormat'
  OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat'
  LOCATION <hdfs path>;

Is this the right approach, where I specify the input file format and write a custom SerDe too?

Please correct me if I am wrong
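
A minimal sketch of how a custom SerDe and InputFormat are typically declared, assuming the classes are packaged in a jar that is added to the Hive session first; apart from the InputFormat name taken from the draft above, the jar path, SerDe class name, and location are placeholders:

    -- the jar path and the SerDe class name are hypothetical placeholders
    ADD JAR /path/to/custom-seqfile-serde.jar;

    CREATE EXTERNAL TABLE seq10 (key STRING, value STRING)
      ROW FORMAT SERDE 'com.org.SequenceFileKeySerDe'            -- the SerDe class goes here, not a RecordReader
      STORED AS
        INPUTFORMAT 'com.org.SequenceFileKeyInputFormat'         -- InputFormat from the draft above
        OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat'
      LOCATION '/user/myfolder/seq10';                           -- illustrative HDFS directory

The EXTERNAL keyword plus a quoted LOCATION path is what points the table at existing data; the class file itself only needs to be on the classpath (via ADD JAR, or hive.aux.jars.path) so the fully qualified names in the DDL can resolve.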

Thanks
Ranjitha.

From: Sanjay Subramanian [mailto:Sanjay.Subramanian@wizecommerce.com]
Sent: 04 April 2013 11:41
To: user@hive.apache.org
Subject: Re: External Table to Sequence File on HDFS

Check this out
http://stackoverflow.com/questions/13203770/reading-hadoop-sequencefiles-with-hive

From: Ranjitha Chandrashekar <Ra...@hcl.com>
Reply-To: "user@hive.apache.org" <user@hive.apache.org>
Date: Wednesday, April 3, 2013 10:43 PM
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: External Table to Sequence File on HDFS


Hi



I want to create an external Hive table over a sequence file (each record is a key-value pair) on HDFS. How will the field names be mapped to the column names?



Please Suggest.



Thanks

Ranjitha.




RE: Error Creating External Table

Posted by Ranjitha Chandrashekar <Ra...@hcl.com>.
Hi Piyush

That was a problem with the path; there were other incompatible files in that directory.
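
For reference, this kind of "Parent path is not a directory" failure can be checked from the Hive CLI itself; the listing below is illustrative, with the path taken from the error message:

    hive> dfs -ls /user/myfolder;        -- check that 'items' shows up as a directory (permissions start with 'd')
    hive> dfs -ls /user/myfolder/items;  -- only the data files the table should read belong in here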

Thanks anyway.. :)

From: Piyush Srivastava [mailto:Piyush.Srivastava@wizecommerce.com]
Sent: 05 April 2013 15:23
To: user@hive.apache.org
Subject: RE: Error Creating External Table

When you give the location, specify it as '/user/myfolder/items'; Hive knows it needs to be stored on HDFS, which is defined in $HADOOP_CONF_DIR/hdfs-site.xml.

Thanks,
./Piyush
________________________________
From: Ranjitha Chandrashekar [Ranjitha.Ch@hcl.com]
Sent: Friday, April 05, 2013 3:16 PM
To: user@hive.apache.org
Subject: Error Creating External Table
Hi

When I try creating an external table over a text file on HDFS, I get the following error. Could someone please let me know where I am going wrong?

 hive> create external table seq4 (item STRING) row format delimited fields terminated by '' STORED as TEXTFILE location 'hdfs://<host>:54310/user/myfolder/items';
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException java.io.FileNotFoundException: Parent path is not a directory: /user/myfolder/items
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.mkdirs(FSDirectory.java:944)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2068)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2029)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
        at sun.reflect.GeneratedMethodAccessor109.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Thanks
Ranjitha.



RE: Error Creating External Table

Posted by Piyush Srivastava <Pi...@wizecommerce.com>.
When you give the location, specify it as '/user/myfolder/items'; Hive knows it needs to be stored on HDFS, which is defined in $HADOOP_CONF_DIR/hdfs-site.xml.
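
Applied to the statement in question, a sketch of the same DDL with a plain HDFS path; the delimiter is an assumption here, since it was blank in the original, and Hive resolves the bare path against the default filesystem from the Hadoop configuration:

    CREATE EXTERNAL TABLE seq4 (item STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'   -- assumed delimiter, for illustration only
      STORED AS TEXTFILE
      LOCATION '/user/myfolder/items';                 -- no hdfs://<host>:54310 prefix needed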

Thanks,
./Piyush
________________________________
From: Ranjitha Chandrashekar [Ranjitha.Ch@hcl.com]
Sent: Friday, April 05, 2013 3:16 PM
To: user@hive.apache.org
Subject: Error Creating External Table

Hi

When I try creating an external table over a text file on HDFS, I get the following error. Could someone please let me know where I am going wrong?

 hive> create external table seq4 (item STRING) row format delimited fields terminated by '' STORED as TEXTFILE location 'hdfs://<host>:54310/user/myfolder/items';
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException java.io.FileNotFoundException: Parent path is not a directory: /user/myfolder/items
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.mkdirs(FSDirectory.java:944)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2068)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2029)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
        at sun.reflect.GeneratedMethodAccessor109.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Thanks
Ranjitha.



Error Creating External Table

Posted by Ranjitha Chandrashekar <Ra...@hcl.com>.
Hi

When I try creating an external table over a text file on HDFS, I get the following error. Could someone please let me know where I am going wrong?

 hive> create external table seq4 (item STRING) row format delimited fields terminated by '' STORED as TEXTFILE location 'hdfs://<host>:54310/user/myfolder/items';
FAILED: Error in metadata: MetaException(message:Got exception: org.apache.hadoop.ipc.RemoteException java.io.FileNotFoundException: Parent path is not a directory: /user/myfolder/items
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.mkdirs(FSDirectory.java:944)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirsInternal(FSNamesystem.java:2068)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:2029)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.mkdirs(NameNode.java:817)
        at sun.reflect.GeneratedMethodAccessor109.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:563)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1388)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1384)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:416)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1093)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1382)
)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask

Thanks
Ranjitha.



Re: External Table to Sequence File on HDFS

Posted by Sanjay Subramanian <Sa...@wizecommerce.com>.
Check this out
http://stackoverflow.com/questions/13203770/reading-hadoop-sequencefiles-with-hive
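
The gist of that discussion, as a rough sketch: out of the box, Hive reads SequenceFiles by ignoring the key and parsing only the value with the declared row format, so the value's fields map to the columns positionally. Table name, delimiter, and path below are illustrative:

    CREATE EXTERNAL TABLE seq_values (col1 STRING, col2 STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'   -- assumes the value is tab-delimited text
      STORED AS SEQUENCEFILE
      LOCATION '/user/data/seqfiles';                  -- illustrative HDFS directory of sequence files

Exposing the key as its own column is what requires the custom InputFormat/SerDe discussed elsewhere in the thread.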

From: Ranjitha Chandrashekar <Ra...@hcl.com>
Reply-To: "user@hive.apache.org" <user@hive.apache.org>
Date: Wednesday, April 3, 2013 10:43 PM
To: "user@hive.apache.org" <user@hive.apache.org>
Subject: External Table to Sequence File on HDFS


Hi



I want to create an external Hive table over a sequence file (each record is a key-value pair) on HDFS. How will the field names be mapped to the column names?



Please Suggest.



Thanks

Ranjitha.


