Posted to issues@spark.apache.org by "prasannaP (JIRA)" <ji...@apache.org> on 2016/09/02 05:29:20 UTC

[jira] [Created] (SPARK-17373) spark+hive+hbase+hbaseIntegration not working

prasannaP created SPARK-17373:
---------------------------------

             Summary: spark+hive+hbase+hbaseIntegration not working
                 Key: SPARK-17373
                 URL: https://issues.apache.org/jira/browse/SPARK-17373
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
            Reporter: prasannaP


SparkSQL + Hive + HBase integration doesn't work


Hi,
I am getting an error when I try to query a Hive table (created through
the HBase integration) from Spark.

Steps I followed:

*Hive table creation code:*
CREATE TABLE test.sample (id string, name string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,details:name")
TBLPROPERTIES ("hbase.table.name" = "sample");

*DESCRIBE test.sample;*
col_name   data_type   comment
id         string      from deserializer
name       string      from deserializer

*Starting the Spark shell* (line continuations added so the command can be
pasted as-is):
spark-shell --master local[2] --driver-class-path \
/usr/local/hive/lib/hive-hbase-handler-1.2.1.jar:\
/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2.jar:\
/usr/local/hbase/lib/hbase-protocol-0.98.9-hadoop2.jar:\
/usr/local/hbase/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:\
/usr/local/hbase/lib/hbase-hadoop-compat-0.98.9-hadoop2.jar:\
/usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar:\
/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar:\
/usr/local/hbase/lib/htrace-core-2.04.jar:\
/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2-tests.jar:\
/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2-tests.jar:\
/usr/local/hive/lib/zookeeper-3.4.6.jar:\
/usr/local/hive/lib/guava-14.0.1.jar
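A hand-typed classpath like the one above is easy to get wrong; as a sketch
(paths assumed from the command above, adjust HBASE_HOME/HIVE_HOME for your
install), the colon-separated string can instead be built from globs so no
jar name is mistyped or left out:

```shell
# Build the --driver-class-path argument from directory globs instead of
# listing every jar by hand. Paths are assumptions from the command above.
HBASE_HOME=${HBASE_HOME:-/usr/local/hbase}
HIVE_HOME=${HIVE_HOME:-/usr/local/hive}

# printf emits each matched path followed by a colon.
CP=$(printf '%s:' "$HIVE_HOME"/lib/hive-hbase-handler-*.jar "$HBASE_HOME"/lib/*.jar)
CP=${CP%:}   # strip the trailing colon printf leaves behind
echo "$CP"

# spark-shell --master local[2] --driver-class-path "$CP"
```

This pulls in every jar under $HBASE_HOME/lib, which is a superset of the
list above but avoids missing one.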


In spark-shell:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("select count(*) from test.sample").collect()

I added this setting in hadoop-env.sh:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_HOME/lib/*

*Stack trace:*

SQL context available as sqlContext.

java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
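One low-level cause of a NoClassDefFoundError like this is simply a jar path
on --driver-class-path that does not exist on disk (a mistyped name is
silently ignored). As a hypothetical sanity check, a small helper could
verify each path before starting the shell; the jar paths below are copied
from the command above (only a few are shown, extend with the rest):

```shell
# check_jars prints any argument that is not an existing file and returns
# non-zero if at least one was missing.
check_jars() {
  rc=0
  for jar in "$@"; do
    [ -f "$jar" ] || { echo "missing: $jar"; rc=1; }
  done
  return $rc
}

# Paths assumed from the spark-shell command above.
check_jars \
  /usr/local/hive/lib/hive-hbase-handler-1.2.1.jar \
  /usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar \
  /usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar \
  || echo "fix the missing paths before starting spark-shell"
```

hbase-common is the jar that normally ships org.apache.hadoop.hbase.util.Bytes,
so that path in particular is worth checking.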


Could somebody help me resolve this error?
I would really appreciate the help.
Thank you.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org