Posted to user@hive.apache.org by "bhsc.happy" <bh...@163.com> on 2014/01/09 04:45:52 UTC

write orcfile exception

Writing an ORC file with compression CompressionKind.ZLIB or CompressionKind.SNAPPY throws the following exception (CompressionKind.NONE works fine):
Exception in thread "main" java.lang.IndexOutOfBoundsException
        at java.nio.ByteBuffer.wrap(ByteBuffer.java:352)
        at org.apache.hadoop.hive.ql.io.orc.InStream$CompressedStream.readHeader(InStream.java:173)
        at org.apache.hadoop.hive.ql.io.orc.InStream$CompressedStream.read(InStream.java:188)
        at org.apache.hadoop.hive.ql.io.orc.SerializationUtils.readInts(SerializationUtils.java:450)
        at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readDirectValues(RunLengthIntegerReaderV2.java:239)
        at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readValues(RunLengthIntegerReaderV2.java:52)
        at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.next(RunLengthIntegerReaderV2.java:287)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StringDictionaryTreeReader.next(RecordReaderImpl.java:978)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StringTreeReader.next(RecordReaderImpl.java:821)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StructTreeReader.next(RecordReaderImpl.java:1089)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:2093)

It is difficult to debug. Can anyone help?
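For context on where this stack trace fails: per the ORC file format, each compressed chunk in a stream is preceded by a 3-byte little-endian header storing (chunkLength * 2 + isOriginal). The sketch below (a standalone illustration, not Hive's actual `InStream` code) decodes such a header; if a corrupted or mis-framed header yields a length larger than the bytes actually available, `ByteBuffer.wrap(buf, offset, length)` throws exactly the `IndexOutOfBoundsException` seen above.

```java
// Sketch of ORC compressed-chunk header decoding (per the ORC format spec).
// The class and method names are hypothetical, for illustration only.
public class OrcChunkHeader {
    public static final int HEADER_SIZE = 3;

    /** Decodes the chunk length from the 3-byte little-endian header. */
    public static int chunkLength(byte[] buf, int offset) {
        int b0 = buf[offset] & 0xff;
        int b1 = buf[offset + 1] & 0xff;
        int b2 = buf[offset + 2] & 0xff;
        // 24-bit little-endian value; drop the low flag bit to get the length
        return ((b2 << 16) | (b1 << 8) | b0) >> 1;
    }

    /** True if the chunk is stored "original" (uncompressed). */
    public static boolean isOriginal(byte[] buf, int offset) {
        return (buf[offset] & 1) == 1;
    }

    public static void main(String[] args) {
        // Encode a header for a compressed chunk of length 100000:
        int len = 100000;
        int word = len << 1; // low bit 0 = compressed
        byte[] header = { (byte) word, (byte) (word >> 8), (byte) (word >> 16) };
        System.out.println(chunkLength(header, 0)); // 100000
        System.out.println(isOriginal(header, 0));  // false
        // If chunkLength() exceeds the remaining bytes in the stream buffer,
        // ByteBuffer.wrap(buf, off, len) throws IndexOutOfBoundsException,
        // which is the symptom in readHeader() above. That usually points to
        // a corrupted chunk header or a buffer-sizing bug, not bad user code.
    }
}
```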




bhsc.happy

Reply: Re: write orcfile exception

Posted by "bhsc.happy" <bh...@163.com>.
I run into this exception with our company's business data, which is huge; my test data works fine.

So I cannot provide test data. I am using https://github.com/apache/hive/tree/branch-0.12




bhsc.happy

From: Prasanth Jayachandran
Sent: 2014-01-09 14:49
To: user
Subject: Re: write orcfile exception
Does it happen with trunk or any specific version of Hive? Can you provide test data that reproduces this issue?


Thanks
Prasanth Jayachandran





CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You. 

Re: write orcfile exception

Posted by Prasanth Jayachandran <pj...@hortonworks.com>.
Does it happen with trunk or any specific version of Hive? Can you provide test data that reproduces this issue?

Thanks
Prasanth Jayachandran

