Posted to user@phoenix.apache.org by vishnu prasad <vi...@gmail.com> on 2015/11/03 10:50:36 UTC

Phoenix-spark : NoClassDefFoundError: HBaseConfiguration

I'm using phoenix-spark to load data from Phoenix/HBase.

Spark version: 1.5.1
HBase version: 1.1.2
Phoenix version: phoenix-4.6.0-HBase-1.1

I'm running spark-shell in standalone mode with the following JARs added:

   - phoenix-spark-4.6.0-HBase-1.1.jar
   - phoenix-4.6.0-HBase-1.1-client.jar
   - phoenix-core-4.6.0-HBase-1.1.jar

spark-1.5.1-bin-hadoop2.6/bin/spark-shell --jars /opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-spark-4.6.0-HBase-1.1.jar /opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-4.6.0-HBase-1.1-client.jar /opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-core-4.6.0-HBase-1.1.jar



import org.apache.spark.sql.SQLContext
import org.apache.phoenix.spark._
val sqlcontext = new SQLContext(sc)
val df = sqlcontext.load("org.apache.phoenix.spark", Map("table" -> "CCPP", "zkUrl" -> "localhost:2181"))


This gives me the following stack trace:

warning: there were 1 deprecation warning(s); re-run with -deprecation for details
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
at org.apache.phoenix.spark.PhoenixRDD.getPhoenixConfiguration(PhoenixRDD.scala:71)
at org.apache.phoenix.spark.PhoenixRDD.phoenixConf$lzycompute(PhoenixRDD.scala:39)
at org.apache.phoenix.spark.PhoenixRDD.phoenixConf(PhoenixRDD.scala:38)
at org.apache.phoenix.spark.PhoenixRDD.<init>(PhoenixRDD.scala:42)
at org.apache.phoenix.spark.PhoenixRelation.schema(PhoenixRelation.scala:50)
at org.apache.spark.sql.execution.datasources.LogicalRelation.<init>(LogicalRelation.scala:31)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:120)
at org.apache.spark.sql.SQLContext.load(SQLContext.scala:1203)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:24)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
at $iwC$$iwC$$iwC.<init>(<console>:37)
at $iwC$$iwC.<init>(<console>:39)
at $iwC.<init>(<console>:41)
at <init>(<console>:43)
at .<init>(<console>:47)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 54 more

Am I doing anything wrong? Is there a fix, or an alternative way to load data from Phoenix/HBase into Spark and write it back to Phoenix?

Thank you
Vishnu

RE: Phoenix-spark : NoClassDefFoundError: HBaseConfiguration

Posted by Josh Mahonin <jm...@interset.com>.
Have you tried setting the SPARK_CLASSPATH or spark.driver.extraClassPath / spark.executor.extraClassPath as specified here?
https://phoenix.apache.org/phoenix_spark.html
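In conf/spark-defaults.conf that would look something like this (a sketch, reusing the install path from your message):

spark.driver.extraClassPath /opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-4.6.0-HBase-1.1-client.jar
spark.executor.extraClassPath /opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-4.6.0-HBase-1.1-client.jar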

Spark treats JARs passed in with '--jars' a little differently than entries on the class path.
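
On the spark-shell command line, the equivalent would be something like this (again a sketch, reusing your path):

spark-1.5.1-bin-hadoop2.6/bin/spark-shell \
  --conf spark.driver.extraClassPath=/opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-4.6.0-HBase-1.1-client.jar \
  --conf spark.executor.extraClassPath=/opt/phoenix/phoenix-4.6.0-HBase-1.1-bin/phoenix-4.6.0-HBase-1.1-client.jar

Two other things worth checking: '--jars' expects a single comma-separated list, so with space-separated paths the extra JARs likely aren't being registered at all; and the phoenix-*-client.jar is a fat JAR that should already bundle the HBase classes (including HBaseConfiguration), so listing phoenix-core separately shouldn't be necessary.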
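With the client JAR on both classpaths, a round trip from the shell would look roughly like this (a sketch: sqlContext.read is the non-deprecated form of sqlContext.load on Spark 1.5, and OUTPUT_TABLE is a hypothetical target table that must already exist in Phoenix with matching columns):

import org.apache.spark.sql.{SQLContext, SaveMode}

val sqlContext = new SQLContext(sc)

// Load the CCPP table from Phoenix as a DataFrame
val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "CCPP", "zkUrl" -> "localhost:2181"))
  .load()

// Write back to Phoenix; the Phoenix data source docs use SaveMode.Overwrite,
// and OUTPUT_TABLE here is a placeholder for your own target table
df.write
  .format("org.apache.phoenix.spark")
  .mode(SaveMode.Overwrite)
  .options(Map("table" -> "OUTPUT_TABLE", "zkUrl" -> "localhost:2181"))
  .save()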