Posted to user@hive.apache.org by Amjad ALSHABANI <as...@gmail.com> on 2014/12/10 11:28:22 UTC

Hive 0.13/Hadoop 0.20 ORC problem

Hello everybody.

I have a problem using the ORC file format in Hive 0.13. I built Hive
0.13 against Hadoop 0.20.

Creating a table with the ORC format works, but when I try to insert some
rows or run a simple count I get this exception:

java.lang.VerifyError: class org.apache.hadoop.hive.ql.io.orc.OrcProto$RowIndex overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
                at java.lang.ClassLoader.defineClass1(Native Method)
                at java.lang.ClassLoader.defineClass(Unknown Source)
                at java.security.SecureClassLoader.defineClass(Unknown Source)
                at java.net.URLClassLoader.defineClass(Unknown Source)
                at java.net.URLClassLoader.access$100(Unknown Source)
                at java.net.URLClassLoader$1.run(Unknown Source)
                at java.net.URLClassLoader$1.run(Unknown Source)
                at java.security.AccessController.doPrivileged(Native Method)
                at java.net.URLClassLoader.findClass(Unknown Source)
                at java.lang.ClassLoader.loadClass(Unknown Source)
                at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
                at java.lang.ClassLoader.loadClass(Unknown Source)
                at org.apache.hadoop.hive.ql.io.orc.WriterImpl.<init>(WriterImpl.java:129)
                at org.apache.hadoop.hive.ql.io.orc.OrcFile.createWriter(OrcFile.java:369)
                at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.close(OrcOutputFormat.java:103)
                at org.apache.hadoop.hive.ql.exec.Utilities.createEmptyFile(Utilities.java:3065)
                at org.apache.hadoop.hive.ql.exec.Utilities.createDummyFileForEmptyPartition(Utilities.java:3089)
                at org.apache.hadoop.hive.ql.exec.Utilities.getInputPaths(Utilities.java:3013)
                at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:369)
                at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
                at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
                at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
                at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:72)
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. class org.apache.hadoop.hive.ql.io.orc.OrcProto$RowIndex overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;



I've checked the protobuf code used by Hive 0.13: it is version 2.5, and
there the method is no longer declared final (it was final in 2.4). I also
checked my CLASSPATH, HADOOP_CLASSPATH, and HADOOP_OPTS, and none of them
pull in protobuf 2.4.
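For anyone checking the same thing, a one-liner like this lists every protobuf jar visible on a colon-separated classpath. The classpath string below is made up for illustration; on a real node you would substitute the output of `hadoop classpath`:

```shell
# Simulated classpath for illustration; on a real node use: CP=$(hadoop classpath)
CP="/opt/hadoop/lib/protobuf-java-2.4.0a.jar:/opt/hive/lib/protobuf-java-2.5.0.jar:/opt/hadoop/lib/guava-11.0.2.jar"
# Split on ':' and keep only protobuf entries; a 2.4.x hit here would explain the VerifyError.
echo "$CP" | tr ':' '\n' | grep -i protobuf
```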

Any idea how to get rid of this exception?

Thanks in advance


Amjad

Re: Hive 0.13/Hadoop 0.20 ORC problem

Posted by Amjad ALSHABANI <as...@gmail.com>.
Hello ,

Just for the record: I regenerated OrcProto.java using protoc version
2.4.1, modified pom.xml to pull in protobuf 2.4.1, and then rebuilt
everything.
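The pom.xml change can be a one-line edit. A sketch, assuming Hive 0.13's pom exposes a `protobuf.version` property (verify the property name in your checkout); the snippet works on a stand-in file so it runs anywhere, but in the real source tree you would point the sed at pom.xml itself:

```shell
# Stand-in for the real pom.xml; in the actual source tree edit pom.xml directly.
printf '<protobuf.version>2.5.0</protobuf.version>\n' > pom-snippet.xml
# Pin protobuf to 2.4.1 so the regenerated OrcProto.java and the runtime agree.
sed -i 's|<protobuf.version>2.5.0</protobuf.version>|<protobuf.version>2.4.1</protobuf.version>|' pom-snippet.xml
cat pom-snippet.xml
# Afterwards, regenerate the sources with protoc 2.4.1 and rebuild, e.g.:
#   protoc --version          # should report libprotoc 2.4.1
#   mvn clean install -DskipTests
```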

Now Hive works well with the ORC format.

Thanks for your help.

Amjad

On Fri, Dec 12, 2014 at 2:56 AM, Jander g <ja...@gmail.com> wrote:

> Both Hadoop 2.6 and Hive 0.13 use protobuf 2.5, so that combination runs
> well. But Hadoop 2.0 and earlier use protobuf 2.4, and as I mentioned
> before, 2.4 and 2.5 are incompatible.
>
> Good luck.
>
> --
> Thanks,
> Jander
>

Re: Hive 0.13/Hadoop 0.20 ORC problem

Posted by Jander g <ja...@gmail.com>.
Both Hadoop 2.6 and Hive 0.13 use protobuf 2.5, so that combination runs
well. But Hadoop 2.0 and earlier use protobuf 2.4, and as I mentioned
before, 2.4 and 2.5 are incompatible.

Good luck.

On Thu, Dec 11, 2014 at 10:22 PM, Amjad ALSHABANI <as...@gmail.com>
wrote:
>
> Hello Jander,
>
> Thanks for your reply.
> I think it is more a Hadoop CLASSPATH problem, because the same Hive 0.13
> build worked well with Hadoop 2.6.
> I'll try to find a way to change the Hadoop classpath so it uses the new
> protobuf instead of 2.4.
>

-- 
Thanks,
Jander

Re: Hive 0.13/Hadoop 0.20 ORC problem

Posted by Amjad ALSHABANI <as...@gmail.com>.
Hello Jander,

Thanks for your reply.
I think it is more a Hadoop CLASSPATH problem, because the same Hive 0.13
build worked well with Hadoop 2.6.
I'll try to find a way to change the Hadoop classpath so it uses the new
protobuf instead of 2.4.
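One hedged sketch of that experiment: push the newer protobuf jar to the front of the classpath. The jar path below is an example, and HADOOP_USER_CLASSPATH_FIRST is only honored by some Hadoop launcher-script versions (not necessarily 0.20), so verify your bin/hadoop before relying on it:

```shell
# Example path only; point this at the protobuf 2.5 jar shipped with Hive.
export HADOOP_CLASSPATH="/opt/hive/lib/protobuf-java-2.5.0.jar:${HADOOP_CLASSPATH}"
# Ask the Hadoop launcher scripts to place user entries before the bundled jars.
# (Assumption: this flag exists in your Hadoop version's scripts; check first.)
export HADOOP_USER_CLASSPATH_FIRST=true
```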




On Thu, Dec 11, 2014 at 11:48 AM, Jander g <ja...@gmail.com> wrote:

> Hi Amjad,
>
> protobuf 2.4 isn't compatible with protobuf 2.5, so you should regenerate
> OrcProto.java with protobuf 2.4.0 and then rebuild Hive.
>
> I hope it helps.
> --
> Thanks,
> Jander
>

Re: Hive 0.13/Hadoop 0.20 ORC problem

Posted by Jander g <ja...@gmail.com>.
Hi Amjad,

protobuf 2.4 isn't compatible with protobuf 2.5, so you should regenerate
OrcProto.java with protobuf 2.4.0 and then rebuild Hive.

I hope it helps.




-- 
Thanks,
Jander