Posted to user@spark.apache.org by Mich Talebzadeh <mi...@gmail.com> on 2016/10/24 15:22:42 UTC

Accessing Phoenix table from Spark 2.0, any cure!

My stack is this:

Spark: Spark 2.0.0
Zookeeper: ZooKeeper 3.4.6
Hbase: hbase-1.2.3
Phoenix: apache-phoenix-4.8.1-HBase-1.2-bin

I am running this simple code:

scala> val df = sqlContext.load("org.apache.phoenix.spark",
     | Map("table" -> "MARKETDATAHBASE", "zkUrl" -> "rhes564:2181")
     | )

java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
  at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:71)
  at org.apache.phoenix.spark.PhoenixRDD.phoenixConf$lzycompute(PhoenixRDD.scala:39)
  at org.apache.phoenix.spark.PhoenixRDD.phoenixConf(PhoenixRDD.scala:38)
  at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:42)
  at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:50)
  at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:40)
  at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
  at org.apache.spark.sql.SQLContext.load(SQLContext.scala:958)
  ... 54 elided

Thanks

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.

Re: Accessing Phoenix table from Spark 2.0, any cure!

Posted by Josh Mahonin <jm...@gmail.com>.
Hi Mich,

If you're seeing the exact same classpath error as the thread you
linked, it's likely that you're not including the Phoenix client JAR
in your Spark driver/executor classpath settings. In previous Phoenix
releases it was necessary to use a specific phoenix-client-spark
assembly JAR, but as of 4.8.0 the regular client JAR should suffice.
Note that the 'phoenix-client' compiled lib is insufficient, as it's
missing many of the other dependencies, such as
HBaseConfiguration.class.
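
A minimal sketch of the launch settings (assuming the binary
distribution is unpacked under /usr/local/phoenix, a hypothetical
path; adjust to your install):

# Hypothetical location; the client JAR ships at the top level of
# the apache-phoenix-4.8.1-HBase-1.2-bin distribution
PHOENIX_JAR=/usr/local/phoenix/phoenix-4.8.1-HBase-1.2-client.jar

spark-shell \
  --conf "spark.driver.extraClassPath=${PHOENIX_JAR}" \
  --conf "spark.executor.extraClassPath=${PHOENIX_JAR}"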

However, I see you're trying with Spark 2.0, so even when you get the
classpath error sorted, it likely will not work, as per:
https://issues.apache.org/jira/browse/PHOENIX-3333

Spark modified the DataFrame API in 2.0, which broke compatibility
with the Phoenix integration. There's follow-up work in Phoenix to
restore that compatibility. Patches are most welcome!
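
For reference, the Spark 2.0 equivalent of the sqlContext.load() call
would be the sketch below ('spark' is the SparkSession that
spark-shell provides), though under Spark 2.0 it will still fail
until PHOENIX-3333 is resolved:

// Spark 2.0-style read via DataFrameReader; sqlContext.load() is
// deprecated in 2.0. This still hits PHOENIX-3333 until the fix lands.
val df = spark.read
  .format("org.apache.phoenix.spark")
  .option("table", "MARKETDATAHBASE")
  .option("zkUrl", "rhes564:2181")
  .load()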

Best,

Josh



Re: Accessing Phoenix table from Spark 2.0, any cure!

Posted by Mich Talebzadeh <mi...@gmail.com>.
Hi Ted,

No joy even after adding hbase-common-1.2.3.jar to HADOOP_CLASSPATH and
CLASSPATH.

Still getting the error. This link
<https://mail-archives.apache.org/mod_mbox/phoenix-user/201607.mbox/%3CCAGYyBgim3LUvBu-6T3u-KGKtBz8P+R_oOGEUj=-uJqGrZ4Yjpw@mail.gmail.com%3E>
shows the same issue.

Thanks



Dr Mich Talebzadeh



On 24 October 2016 at 20:07, Ted Yu <yu...@gmail.com> wrote:

> HBaseConfiguration is in hbase-common module.
>
> See if hbase-common jar is on the classpath.

Re: Accessing Phoenix table from Spark 2.0, any cure!

Posted by Ted Yu <yu...@gmail.com>.
HBaseConfiguration is in the hbase-common module.

See if the hbase-common jar is on the classpath.
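
One quick way to check, for example, is from the spark-shell itself
(a sketch; the call only succeeds if hbase-common is on the driver
classpath):

scala> // Throws ClassNotFoundException if hbase-common is missing
scala> Class.forName("org.apache.hadoop.hbase.HBaseConfiguration")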
