Posted to user@hive.apache.org by kalai selvi <ka...@gmail.com> on 2016/03/22 17:05:12 UTC

Using Hive SerDe dependent on Protobuf 2.6

Hi,

I am using Hive 0.13 on Amazon EMR, and I am stuck on a problem: the
hive-exec jar bundles an older version of the Protocol Buffers Java library
(protobuf 2.5). Please help me get unblocked.

We have developed a custom SerDe that depends on protobuf-java 2.6, and the
SerDe jar is built with protobuf-java 2.6 bundled inside it.

When I create a table using my custom SerDe, it fails while parsing the
data with the error below. The cause is that a protobuf method my SerDe
calls does not exist in protobuf 2.5. Even though I have bundled protobuf
2.6 with the SerDe, Hive does not seem to use it; instead it picks up the
protobuf 2.5 that ships inside the hive-exec jar. Adding the protobuf 2.6
jar explicitly on the Hive console does not help either. How do I make my
SerDe use the protobuf version bundled within it?

Exception in thread "main" java.lang.NoSuchMethodError: com.google.protobuf.LazyStringList.getUnmodifiableView()Lcom/google/protobuf/LazyStringList;
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$GateCodeAttribute.<init>(AddressAttributeTypes.java:3451)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$GateCodeAttribute.<init>(AddressAttributeTypes.java:3384)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$GateCodeAttribute$1.parsePartialFrom(AddressAttributeTypes.java:3475)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$GateCodeAttribute$1.parsePartialFrom(AddressAttributeTypes.java:3470)
    at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$Attribute.<init>(AddressAttributeTypes.java:961)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$Attribute.<init>(AddressAttributeTypes.java:862)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$Attribute$1.parsePartialFrom(AddressAttributeTypes.java:1051)
    at com.amazon.transportation.ate.aas.AddressAttributeTypes$Attribute$1.parsePartialFrom(AddressAttributeTypes.java:1046)
    at com.google.protobuf.CodedInputStream.readMessage(CodedInputStream.java:309)
    at com.amazon.transportation.ate.aas.AtePlacesExchanges$AteAddressAttributeNotification.<init>(AtePlacesExchanges.java:883)
    at com.amazon.transportation.ate.aas.AtePlacesExchanges$AteAddressAttributeNotification.<init>(AtePlacesExchanges.java:811)
    at com.amazon.transportation.ate.aas.AtePlacesExchanges$AteAddressAttributeNotification$1.parsePartialFrom(AtePlacesExchanges.java:919)
    at com.amazon.transportation.ate.aas.AtePlacesExchanges$AteAddressAttributeNotification$1.parsePartialFrom(AtePlacesExchanges.java:914)
    at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:141)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:176)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:188)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:193)
    at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
    at com.amazon.transportation.ate.aas.AtePlacesExchanges$AteAddressAttributeNotification.parseFrom(AtePlacesExchanges.java:1134)
    at com.amazon.places.protobuf.decoder.NotificationsProtoDecoder.<init>(NotificationsProtoDecoder.java:57)
    at com.amazon.places.serde.AtePlacesNotificationSerde.deserialize(AtePlacesNotificationSerde.java:103)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:620)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:534)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:137)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1519)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:292)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:227)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:430)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:803)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:697)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:636)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
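One way to confirm which protobuf jar is actually winning on the Hive
classpath is to ask the classloader where it found the class from the top of
the stack trace. A minimal diagnostic sketch (the class name WhichJar is my
own, not part of Hive or this thread):

```java
import java.security.CodeSource;

// Prints the jar a class was loaded from, so you can see whether the
// protobuf inside hive-exec is shadowing the 2.6 copy bundled in the SerDe.
public class WhichJar {
    static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap classes (e.g. java.lang.String) have no code source.
            return (src == null || src.getLocation() == null)
                    ? "bootstrap/unknown"
                    : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "not on classpath";
        }
    }

    public static void main(String[] args) {
        // On the Hive CLI classpath this should print the hive-exec jar path
        // if the bundled protobuf 2.5 is the one being loaded.
        System.out.println(locate("com.google.protobuf.LazyStringList"));
    }
}
```

Running this (or the equivalent one-liner inside the SerDe) from the same
JVM as the Hive CLI shows exactly which jar supplies the conflicting class.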


Any pointers to a solution would be much appreciated.


Thanks,

Kalai

Re: Using Hive SerDe dependent on Protobuf 2.6

Posted by Edward Capriolo <ed...@gmail.com>.
Both Thrift and protobuf are wire-compatible but NOT classpath-compatible;
you need to make sure that you are using one version (even down to the
minor version) across your entire codebase.
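When pinning a single version is not possible (here hive-exec hard-bundles
protobuf 2.5), a common workaround is to shade and relocate the 2.6 copy
inside the SerDe jar so the two versions can coexist. This is a sketch with
the Maven Shade plugin, not something from this thread, and the
shadedPattern package name is hypothetical:

```xml
<!-- Sketch (assumption, not from this thread): relocate protobuf 2.6
     inside the SerDe jar so it cannot clash with hive-exec's protobuf 2.5.
     Generated message classes bundled in the same shaded jar get their
     references rewritten to the relocated package automatically. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <!-- hypothetical relocated package name -->
            <shadedPattern>com.amazon.places.shaded.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

After shading, the SerDe and its generated protobuf classes reference the
relocated package, so the protobuf 2.5 classes loaded from hive-exec are
never consulted.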

On Tue, Mar 22, 2016 at 12:05 PM, kalai selvi <ka...@gmail.com> wrote:

> [snip]