Posted to user@spark.apache.org by Doug Balog <do...@dugos.com> on 2016/02/02 22:40:38 UTC

Error trying to get DF for Hive table stored in HBase

I’m trying to create a DF for an external Hive table that is stored in HBase.
I get a NoSuchMethodError: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initSerdeParams(Lorg/apache/hadoop/conf/Configuration;Ljava/util/Properties;Ljava/lang/String;)Lorg/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe$SerDeParameters;

I’m running Spark 1.6.0 on HDP 2.2.4-12-1 (Hive 0.14 and HBase 0.98.4) in secure mode. 

Has anybody seen this before?

Below are the stack trace and the Hive table’s DDL.

scala> sqlContext.table("item_data_lib.pcn_item")
java.lang.NoSuchMethodError: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initSerdeParams(Lorg/apache/hadoop/conf/Configuration;Ljava/util/Properties;Ljava/lang/String;)Lorg/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe$SerDeParameters;
	at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:93)
	at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:92)
	at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
	at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
	at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
	at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
	at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:331)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:326)
	at scala.Option.map(Option.scala:145)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:326)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:321)
	at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:279)
	at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:226)
	at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:225)
	at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:268)
	at org.apache.spark.sql.hive.client.ClientWrapper.getTableOption(ClientWrapper.scala:321)
	at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:122)
	at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
	at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:384)
	at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:457)
	at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:161)
	at org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:457)
	at org.apache.spark.sql.SQLContext.table(SQLContext.scala:831)
	at org.apache.spark.sql.SQLContext.table(SQLContext.scala:827)
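A NoSuchMethodError like this usually means the Hive classes Spark loads at runtime differ from the Hive version the HBase storage handler was compiled against. One possible workaround (a sketch only; the jar paths below are assumptions, substitute your HDP install locations) is to point Spark 1.6’s isolated metastore client at the cluster’s Hive version in spark-defaults.conf instead of its built-in Hive 1.2.1:

```
# Sketch: use the cluster's Hive 0.14 client jars instead of Spark's builtin.
# Paths are hypothetical examples for an HDP layout.
spark.sql.hive.metastore.version  0.14.0
spark.sql.hive.metastore.jars     /usr/hdp/current/hive-client/lib/*:/usr/hdp/current/hadoop-client/*
```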


hive> show create table item_data_lib.pcn_item;
OK
CREATE EXTERNAL TABLE `item_data_lib.pcn_item`(
  `key` string COMMENT 'from deserializer',
  `p1` string COMMENT 'from deserializer',
  `p2` string COMMENT 'from deserializer',
  `p3` string COMMENT 'from deserializer',
  `p4` string COMMENT 'from deserializer',
  `p5` string COMMENT 'from deserializer',
  `p6` string COMMENT 'from deserializer',
  `p7` string COMMENT 'from deserializer',
  `p8` string COMMENT 'from deserializer',
  `p9` string COMMENT 'from deserializer',
  `p10` string COMMENT 'from deserializer',
  `p11` string COMMENT 'from deserializer',
  `p12` string COMMENT 'from deserializer',
  `p13` string COMMENT 'from deserializer',
  `d1` string COMMENT 'from deserializer',
  `d2` string COMMENT 'from deserializer',
  `d3` string COMMENT 'from deserializer',
  `d4` string COMMENT 'from deserializer',
  `d5` string COMMENT 'from deserializer',
  `d6` string COMMENT 'from deserializer',
  `d7` string COMMENT 'from deserializer',
  `d8` string COMMENT 'from deserializer',
  `d9` string COMMENT 'from deserializer',
  `d10` string COMMENT 'from deserializer',
  `d11` string COMMENT 'from deserializer',
  `d12` string COMMENT 'from deserializer',
  `d13` string COMMENT 'from deserializer',
  `d14` string COMMENT 'from deserializer',
  `d15` string COMMENT 'from deserializer',
  `d16` string COMMENT 'from deserializer',
  `d17` string COMMENT 'from deserializer')
ROW FORMAT SERDE
  'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY
  'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  'hbase.columns.mapping'=':key,p:p1,p:p2,p:p3,p:p4,p:p5,p:p6,p:p7,p:p8,p:p9,p:p10,p:p11,p:p12,p:p13,d:d1,d:d2,d:d3,d:d4,d:d5,d:d6,d:d7,d:d8,d:d9,d:d10,d:d11,d:d12,d:d13,d:d14,d:d15,d:d16,d:d17',
  'serialization.format'='1')
TBLPROPERTIES (
  'hbase.table.name'='item_data_lib:pcn_item',
  'transient_lastDdlTime'='1454038459')

Thanks,

Doug



---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Error trying to get DF for Hive table stored in HBase

Posted by Ted Yu <yu...@gmail.com>.
Looks like this is related:
HIVE-12406

FYI
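
If it is a version conflict like HIVE-12406 suggests, it can help to find out which jar actually supplies the offending class on the driver classpath. A minimal sketch (run it in spark-shell with the failing class name, e.g. "org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe"; here the demo uses scala.Option since the Hive jars are not assumed to be present):

```scala
object JarLocator {
  // Returns the location (usually a jar URL) that provided the named class,
  // or None when the class comes from the bootstrap classloader.
  def jarOf(className: String): Option[String] =
    Option(Class.forName(className).getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)

  def main(args: Array[String]): Unit = {
    // Demo: scala.Option is provided by the Scala library jar.
    println(jarOf("scala.Option").getOrElse("bootstrap classloader"))
  }
}
```

If the printed path points at a different Hive version than the one the HBase storage handler was built for, that mismatch is the likely culprit.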

On Tue, Feb 2, 2016 at 1:40 PM, Doug Balog <do...@dugos.com> wrote:

> I’m trying to create a DF for an external Hive table that is in HBase.
> I get the a NoSuchMethodError
> org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initSerdeParams(Lorg/apache/hadoop/conf/Configuration;Ljava/util/Properties;Ljava/lang/String;)Lorg/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe$SerDeParameters;
>
> I’m running Spark 1.6.0 on HDP 2.2.4-12-1 (Hive 0.14 and HBase 0.98.4) in
> secure mode.
>
> Anybody see this before ?
>
> Below is a stack trace and the hive table’s info.
>
> scala> sqlContext.table("item_data_lib.pcn_item")
> java.lang.NoSuchMethodError:
> org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initSerdeParams(Lorg/apache/hadoop/conf/Configuration;Ljava/util/Properties;Ljava/lang/String;)Lorg/apache/hadoop/hive/serde2/lazy/LazySimpleSerDe$SerDeParameters;
>         at
> org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:93)
>         at
> org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:92)
>         at
> org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
>         at
> org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
>         at
> org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
>         at
> org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
>         at
> org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
>         at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:331)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:326)
>         at scala.Option.map(Option.scala:145)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:326)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:321)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:279)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:226)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:225)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:268)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper.getTableOption(ClientWrapper.scala:321)
>         at
> org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:122)
>         at
> org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
>         at
> org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:384)
>         at org.apache.spark.sql.hive.HiveContext$$anon$2.org
> $apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:457)
>         at
> org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:161)
>         at
> org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:457)
>         at org.apache.spark.sql.SQLContext.table(SQLContext.scala:831)
>         at org.apache.spark.sql.SQLContext.table(SQLContext.scala:827)
>
>
> hive> show create table item_data_lib.pcn_item;
> OK
> CREATE EXTERNAL TABLE `item_data_lib.pcn_item`(
>   `key` string COMMENT 'from deserializer',
>   `p1` string COMMENT 'from deserializer',
>   `p2` string COMMENT 'from deserializer',
>   `p3` string COMMENT 'from deserializer',
>   `p4` string COMMENT 'from deserializer',
>   `p5` string COMMENT 'from deserializer',
>   `p6` string COMMENT 'from deserializer',
>   `p7` string COMMENT 'from deserializer',
>   `p8` string COMMENT 'from deserializer',
>   `p9` string COMMENT 'from deserializer',
>   `p10` string COMMENT 'from deserializer',
>   `p11` string COMMENT 'from deserializer',
>   `p12` string COMMENT 'from deserializer',
>   `p13` string COMMENT 'from deserializer',
>   `d1` string COMMENT 'from deserializer',
>   `d2` string COMMENT 'from deserializer',
>   `d3` string COMMENT 'from deserializer',
>   `d4` string COMMENT 'from deserializer',
>   `d5` string COMMENT 'from deserializer',
>   `d6` string COMMENT 'from deserializer',
>   `d7` string COMMENT 'from deserializer',
>   `d8` string COMMENT 'from deserializer',
>   `d9` string COMMENT 'from deserializer',
>   `d10` string COMMENT 'from deserializer',
>   `d11` string COMMENT 'from deserializer',
>   `d12` string COMMENT 'from deserializer',
>   `d13` string COMMENT 'from deserializer',
>   `d14` string COMMENT 'from deserializer',
>   `d15` string COMMENT 'from deserializer',
>   `d16` string COMMENT 'from deserializer',
>   `d17` string COMMENT 'from deserializer')
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.hbase.HBaseSerDe'
> STORED BY
>   'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> WITH SERDEPROPERTIES (
>
> 'hbase.columns.mapping'=':key,p:p1,p:p2,p:p3,p:p4,p:p5,p:p6,p:p7,p:p8,p:p9,p:p10,p:p11,p:p12,p:p13,d:d1,d:d2,d:d3,d:d4,d:d5,d:d6,d:d7,d:d8,d:d9,d:d10,d:d11,d:d12,d:d13,d:d14,d:d15,d:d16,d:d17',
>   'serialization.format'='1')
> TBLPROPERTIES (
>   'hbase.table.name'='item_data_lib:pcn_item',
>   'transient_lastDdlTime'='1454038459’)
>
> Thanks,
>
> Doug
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>