Posted to hdfs-user@hadoop.apache.org by Divya Gehlot <di...@gmail.com> on 2016/02/29 10:48:01 UTC
[Error]: Spark 1.5.2 + HiveHbase Integration
Hi,
I am trying to access a Hive table that was created using HBaseIntegration
<https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration#HBaseIntegration-Introduction>.
I can read the data from the Hive CLI, but when I try to access the table
through Spark's HiveContext I get the following error:
> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
> at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
> at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
> at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
> at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
> at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
> at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
> at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:330)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:325)
I have added the following jars to the Spark classpath:
/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,
/usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,
/usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,
/usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar
Which jar files am I missing?
Thanks,
Regards,
Divya
Re: [Error]: Spark 1.5.2 + HiveHbase Integration
Posted by "mohit.kaushik" <mo...@orkash.com>.
Don't you think you need an HBase jar?
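Concretely, the NoClassDefFoundError for org/apache/hadoop/hbase/util/Bytes points at the HBase client jars. A sketch of building a --jars list from the HDP 2.3.4 layout in the original message (the hbase lib path and the unversioned jar names are assumptions; check what actually exists on your nodes):

```shell
# Assumed HDP 2.3.4 layout -- verify these paths and jar names on your cluster.
HBASE_LIB=/usr/hdp/2.3.4.0-3485/hbase/lib

# hbase-common provides org.apache.hadoop.hbase.util.Bytes;
# hbase-client/protocol/server are typically needed alongside it.
JARS="$HBASE_LIB/hbase-common.jar"
JARS="$JARS,$HBASE_LIB/hbase-client.jar"
JARS="$JARS,$HBASE_LIB/hbase-protocol.jar"
JARS="$JARS,$HBASE_LIB/hbase-server.jar"
JARS="$JARS,/usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar"

# Pass the comma-separated list to spark-shell / spark-submit:
echo "spark-shell --jars $JARS"
```

The same list also needs to reach the executors (e.g. via spark.executor.extraClassPath), not just the driver.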
On 02/29/2016 03:18 PM, Divya Gehlot wrote:
> Hi,
> I am trying to access hive table which been created using
> HbaseIntegration
> <https://cwiki.apache.org/confluence/display/Hive/HBaseIntegration#HBaseIntegration-Introduction>
> I am able to access data in Hive CLI
> But when I am trying to access the table using hivecontext of Spark
> getting following error
>
> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
> at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
> at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
> at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
> at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
> at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
> at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
> at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:330)
> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:325)
>
>
>
> Have added following jars to Spark class path :
> /usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar
>
> Which jar files am I missing ??
>
>
> Thanks,
> Regards,
> Divya
--
*Mohit Kaushik*
Software Engineer
A Square, Plot No. 278, Udyog Vihar, Phase 2, Gurgaon 122016, India
*Tel:* +91 (124) 4969352 | *Fax:* +91 (124) 4033553
Re: [Error]: Spark 1.5.2 + HiveHbase Integration
Posted by Ted Yu <yu...@gmail.com>.
Divya:
Please try not to cross-post your question.
In your case the hbase-common jar is needed. To find all the HBase jars needed, you can run 'mvn dependency:tree' and check its output.
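Ted's suggestion can be scripted to pull out just the HBase artifacts. The tree below is illustrative sample output, not from the poster's build; on a real project you would pipe `mvn dependency:tree` directly into the grep:

```shell
# Sample 'mvn dependency:tree' output (illustrative only, versions made up).
cat > /tmp/deptree.txt <<'EOF'
[INFO] +- org.apache.hbase:hbase-client:jar:1.1.2:compile
[INFO] |  +- org.apache.hbase:hbase-common:jar:1.1.2:compile
[INFO] |  +- org.apache.hbase:hbase-protocol:jar:1.1.2:compile
[INFO] +- org.apache.hive:hive-hbase-handler:jar:1.2.1:compile
EOF

# On a live build: mvn dependency:tree | grep -o 'org\.apache\.hbase:[^:]*' | sort -u
# Prints each distinct group:artifact once:
#   org.apache.hbase:hbase-client
#   org.apache.hbase:hbase-common
#   org.apache.hbase:hbase-protocol
grep -o 'org\.apache\.hbase:[^:]*' /tmp/deptree.txt | sort -u
```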
> On Feb 29, 2016, at 1:48 AM, Divya Gehlot <di...@gmail.com> wrote:
>
> Hi,
> I am trying to access hive table which been created using HbaseIntegration
> I am able to access data in Hive CLI
> But when I am trying to access the table using hivecontext of Spark
> getting following error
>> java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
>> at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
>> at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
>> at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
>> at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
>> at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
>> at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
>> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
>> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
>> at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
>> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:330)
>> at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:325)
>
>
>
> Have added following jars to Spark class path :
> /usr/hdp/2.3.4.0-3485/hive/lib/hive-hbase-handler.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/zookeeper-3.4.6.2.3.4.0-3485.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/guava-14.0.1.jar,
> /usr/hdp/2.3.4.0-3485/hive/lib/protobuf-java-2.5.0.jar
>
> Which jar files am I missing ??
>
>
> Thanks,
> Regards,
> Divya