Posted to user@hive.apache.org by Rajesh Balamohan <ra...@gmail.com> on 2013/10/22 15:53:15 UTC

Hive 12 with Hadoop 2.x with ORC

Hi All,

When running Hive 12 on Hadoop 2.x with ORC, I get the following error
while converting a text-file table to an ORC-format table.  Any help would
be greatly appreciated.

2013-10-22 06:50:49,563 WARN [main]
org.apache.hadoop.mapred.YarnChild: Exception running child :
java.lang.RuntimeException: Hive Runtime Error while closing operators
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:240)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1477)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.lang.UnsupportedOperationException: This is supposed
to be overridden by subclasses.
	at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$ColumnStatistics.getSerializedSize(OrcProto.java:3046)
	at com.google.protobuf.CodedOutputStream.computeMessageSizeNoTag(CodedOutputStream.java:749)
	at com.google.protobuf.CodedOutputStream.computeMessageSize(CodedOutputStream.java:530)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$RowIndexEntry.getSerializedSize(OrcProto.java:4129)
	at com.google.protobuf.CodedOutputStream.computeMessageSizeNoTag(CodedOutputStream.java:749)
	at com.google.protobuf.CodedOutputStream.computeMessageSize(CodedOutputStream.java:530)
	at org.apache.hadoop.hive.ql.io.orc.OrcProto$RowIndex.getSerializedSize(OrcProto.java:4641)
	at com.google.protobuf.AbstractMessageLite.writeTo(AbstractMessageLite.java:75)
	at org.apache.hadoop.hive.ql.io.orc.WriterImpl$TreeWriter.writeStripe(WriterImpl.java:548)
	at org.apache.hadoop.hive.ql.io.orc.WriterImpl$StructTreeWriter.writeStripe(WriterImpl.java:1328)
	at org.apache.hadoop.hive.ql.io.orc.WriterImpl.flushStripe(WriterImpl.java:1699)
	at org.apache.hadoop.hive.ql.io.orc.WriterImpl.close(WriterImpl.java:1868)
	at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:95)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.closeWriters(FileSinkOperator.java:181)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:866)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:596)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:613)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:613)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:613)
	at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.close(ExecMapper.java:207)
	... 8 more
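
For context, the conversion that triggers this error is typically of the
following shape (a hypothetical sketch; the table and column names are
placeholders, not taken from this thread):

```sql
-- Illustrative only: convert a text-backed table to ORC.
CREATE TABLE page_views_orc (user_id BIGINT, url STRING)
STORED AS ORC;

INSERT OVERWRITE TABLE page_views_orc
SELECT user_id, url
FROM page_views_text;
```

The failure occurs in the map task when the ORC writer closes and
serializes its stripe metadata via protobuf, which is why the stack trace
ends inside FileSinkOperator rather than at query-parse time.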





-- 
~Rajesh.B

Re: Hive 12 with Hadoop 2.x with ORC

Posted by Rajesh Balamohan <ra...@gmail.com>.
Thanks a lot Thejas.  I will try with the patch.


On Wed, Oct 23, 2013 at 2:08 AM, Thejas Nair <th...@hortonworks.com> wrote:

> protobuf 2.5 upgrade did not get included in hive 0.12 (HIVE-5112).
> You might want to apply the protobuf update patch on top of 0.12 to
> use it with recent versions of hadoop 2.x (but I am not certain if this
> is a protobuf version issue).



-- 
~Rajesh.B

Re: Hive 12 with Hadoop 2.x with ORC

Posted by Thejas Nair <th...@hortonworks.com>.
protobuf 2.5 upgrade did not get included in hive 0.12 (HIVE-5112).
You might want to apply the protobuf update patch on top of 0.12 to
use it with recent versions of hadoop 2.x (but I am not certain if this
is a protobuf version issue).
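
One rough way to check whether two protobuf versions are colliding on a
node is to look for the jars on disk (the directories below are guesses;
adjust them to your install layout). Hadoop 2.x ships protobuf-java 2.5.x,
while stock Hive 0.12 was built against 2.4.x, so finding both versions
side by side would be consistent with this kind of mismatch:

```shell
# Look for protobuf jars under typical Hadoop/Hive install roots.
# The paths are illustrative guesses, not taken from this thread.
find /usr/lib/hadoop /usr/lib/hive -name 'protobuf-java-*.jar' 2>/dev/null || true
```

The "This is supposed to be overridden by subclasses" exception is the
classic symptom of protobuf-generated classes running against a mismatched
protobuf runtime, which is what points at HIVE-5112 here.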



-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.