Posted to user@spark.apache.org by Mich Talebzadeh <mi...@gmail.com> on 2016/10/07 08:27:53 UTC

issue accessing Phoenix table from Spark

Hi,

My code tries to load a Phoenix table built on an HBase table.

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.{HBaseConfiguration, HColumnDescriptor, HTableDescriptor}
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.sql.types._
import org.apache.phoenix.spark._


The code is based on the example at https://phoenix.apache.org/phoenix_spark.html

scala> val HiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
warning: there was one deprecation warning; re-run with -deprecation for details
HiveContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@533e8807

scala> val df = HiveContext.load(
     | "org.apache.phoenix.spark",
     | Map("table" -> "temptable", "zkUrl" -> "rhes564:2181")
     |   )

warning: there was one deprecation warning; re-run with -deprecation for details
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
  at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:71)
  at org.apache.phoenix.spark.PhoenixRDD.phoenixConf$lzycompute(PhoenixRDD.scala:39)
  at org.apache.phoenix.spark.PhoenixRDD.phoenixConf(PhoenixRDD.scala:38)
  at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:42)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:50)
  at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:40)
  at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:958)
  ... 55 elided
I also tried building this as a fat jar with Maven, but I still get the same error.

For reference, the original example from the Phoenix page looks like this:

val df = sqlContext.load(
  "org.apache.phoenix.spark",
  Map("table" -> "TABLE1", "zkUrl" -> "phoenix-server:2181")
)

Thanks

Dr Mich Talebzadeh

LinkedIn  https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.

Re: issue accessing Phoenix table from Spark

Posted by Jörn Franke <jo...@gmail.com>.
Have you verified that this class is in the fat jar? It looks like some of the HBase libraries are missing ...
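If the HBase classes are indeed missing, an alternative to rebuilding the fat jar is to hand Spark the Phoenix client jar (which bundles its HBase dependencies) at launch time. A sketch with placeholder paths — the jar name matches the apache-phoenix-4.8.1-HBase-1.2-bin distribution mentioned in this thread, but adjust it to your install:

```shell
# Launch-time classpath sketch (paths are placeholders, not the poster's setup).
# Option 1: ship the self-contained Phoenix client jar with --jars:
spark-shell \
  --jars /opt/phoenix/phoenix-4.8.1-HBase-1.2-client.jar

# Option 2: put it on the driver and executor classpaths via configuration,
# as the Phoenix Spark plugin page recommends:
spark-shell \
  --conf spark.driver.extraClassPath=/opt/phoenix/phoenix-4.8.1-HBase-1.2-client.jar \
  --conf spark.executor.extraClassPath=/opt/phoenix/phoenix-4.8.1-HBase-1.2-client.jar
```

Either way, the point is that HBaseConfiguration must be visible to the driver JVM before the Phoenix data source is loaded.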

> On 21 Oct 2016, at 11:45, Mich Talebzadeh <mi...@gmail.com> wrote:
> 
> Still does not work with Spark 2.0.0 on apache-phoenix-4.8.1-HBase-1.2-bin
> 
> thanks
> 

Re: issue accessing Phoenix table from Spark

Posted by Mich Talebzadeh <mi...@gmail.com>.
Still does not work with Spark 2.0.0 on apache-phoenix-4.8.1-HBase-1.2-bin

thanks

