Posted to user@hbase.apache.org by Robert James <sr...@gmail.com> on 2016/07/05 06:36:02 UTC

NoClassDefFoundError org/apache/hadoop/hbase/HBaseConfiguration

When trying to load HBase via Spark, I get NoClassDefFoundError
org/apache/hadoop/hbase/HBaseConfiguration errors.

How do I provide that class to Spark?

Re: NoClassDefFoundError org/apache/hadoop/hbase/HBaseConfiguration

Posted by Ted Yu <yu...@gmail.com>.
Robert:

When using `spark-submit`, the application jar, along with any jars included
with the `--jars` option, will be automatically transferred to the cluster.
URLs supplied after `--jars` must be separated by commas. That list is
included on the driver and executor classpaths. Directory expansion does not
work with `--jars`.

For spark.driver.extraClassPath :

Extra classpath entries to prepend to the classpath of the driver.
*Note:* In client mode, this config must not be set through the SparkConf
directly in your application, because the driver JVM has already started at
that point. Instead, please set this through the --driver-class-path command
line option or in your default properties file.

FYI
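As a concrete sketch of the two mechanisms (the paths, versions, and main
class below are hypothetical; adjust them to your installation):

```shell
# Hypothetical paths and main class; adjust to your installation.
# --jars takes a comma-separated list (no directory expansion); the listed
# jars are shipped to the cluster and land on driver and executor classpaths.
spark-submit \
  --class com.example.MyHBaseJob \
  --jars /opt/hbase/lib/hbase-common.jar,/opt/hbase/lib/hbase-client.jar \
  my-app.jar

# In client mode the driver JVM is already running before SparkConf is read,
# so the driver classpath must come from the command line (or from
# spark-defaults.conf), not from SparkConf in application code:
spark-submit \
  --driver-class-path /opt/hbase/lib/hbase-common.jar \
  --class com.example.MyHBaseJob \
  my-app.jar
```

Note the difference: spark.driver.extraClassPath (and --driver-class-path)
only prepends entries to the driver's classpath, while --jars also ships the
jars to the executors, which is why a spark-shell --jars run can succeed
where the config-file setting alone fails.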

On Tue, Jul 5, 2016 at 3:39 PM, Robert James <sr...@gmail.com> wrote:

> I'm using spark-shell.  The perplexing thing is that if I load it via
> spark-shell --jars, it seems to work.  However, if I load it via
> spark.driver.extraClassPath in the config file, it seems to fail.
> What is the difference between --jars (command line) and
> spark.driver.extraClassPath (config)?
>
> On 7/5/16, Dima Spivak <ds...@cloudera.com> wrote:
> > Hey Robert,
> >
> > HBaseConfiguration is part of the hbase-common module of the HBase
> project.
> > Are you using Maven to provide dependencies or just running java -cp?
> >
> > -Dima
> >
> > On Monday, July 4, 2016, Robert James <sr...@gmail.com> wrote:
> >
> >> When trying to load HBase via Spark, I get NoClassDefFoundError
> >> org/apache/hadoop/hbase/HBaseConfiguration errors.
> >>
> >> How do I provide that class to Spark?
> >>
> >
>

Re: NoClassDefFoundError org/apache/hadoop/hbase/HBaseConfiguration

Posted by Dima Spivak <ds...@cloudera.com>.
Hey Robert,

Probably a better question to ask over at user@spark.apache.org.
hbase-common.jar would be the artifact you’d wanna put on the class path,
though.

-Dima
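A minimal sketch of that (jar path and version below are hypothetical;
adjust to your HBase installation): verify the class really is packaged in
hbase-common, then put that jar on the classpath when launching spark-shell:

```shell
# Hypothetical path/version; adjust to your HBase installation.
# Confirm HBaseConfiguration is packaged in hbase-common:
unzip -l /opt/hbase/lib/hbase-common-1.2.1.jar | grep HBaseConfiguration

# Then launch spark-shell with the jar on both driver and executor classpaths:
spark-shell --jars /opt/hbase/lib/hbase-common-1.2.1.jar
```

In practice hbase-common has further dependencies of its own (e.g.
hbase-client for table access), so you may end up listing several jars in
the comma-separated --jars list.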

On Tue, Jul 5, 2016 at 3:39 PM, Robert James <sr...@gmail.com> wrote:

> I'm using spark-shell.  The perplexing thing is that if I load it via
> spark-shell --jars, it seems to work.  However, if I load it via
> spark.driver.extraClassPath in the config file, it seems to fail.
> What is the difference between --jars (command line) and
> spark.driver.extraClassPath (config)?
>
> On 7/5/16, Dima Spivak <ds...@cloudera.com> wrote:
> > Hey Robert,
> >
> > HBaseConfiguration is part of the hbase-common module of the HBase
> project.
> > Are you using Maven to provide dependencies or just running java -cp?
> >
> > -Dima
> >
> > On Monday, July 4, 2016, Robert James <sr...@gmail.com> wrote:
> >
> >> When trying to load HBase via Spark, I get NoClassDefFoundError
> >> org/apache/hadoop/hbase/HBaseConfiguration errors.
> >>
> >> How do I provide that class to Spark?
> >>
> >
>

Re: NoClassDefFoundError org/apache/hadoop/hbase/HBaseConfiguration

Posted by Robert James <sr...@gmail.com>.
I'm using spark-shell.  The perplexing thing is that if I load it via
spark-shell --jars, it seems to work.  However, if I load it via
spark.driver.extraClassPath in the config file, it seems to fail.
What is the difference between --jars (command line) and
spark.driver.extraClassPath (config)?

On 7/5/16, Dima Spivak <ds...@cloudera.com> wrote:
> Hey Robert,
>
> HBaseConfiguration is part of the hbase-common module of the HBase project.
> Are you using Maven to provide dependencies or just running java -cp?
>
> -Dima
>
> On Monday, July 4, 2016, Robert James <sr...@gmail.com> wrote:
>
>> When trying to load HBase via Spark, I get NoClassDefFoundError
>> org/apache/hadoop/hbase/HBaseConfiguration errors.
>>
>> How do I provide that class to Spark?
>>
>

Re: NoClassDefFoundError org/apache/hadoop/hbase/HBaseConfiguration

Posted by Dima Spivak <ds...@cloudera.com>.
Hey Robert,

HBaseConfiguration is part of the hbase-common module of the HBase project.
Are you using Maven to provide dependencies or just running java -cp?

-Dima
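For the plain java -cp case, a hedged sketch (paths and class name are
hypothetical): without Maven resolving dependencies for you, the HBase jars
have to appear on the classpath explicitly:

```shell
# Hypothetical paths and main class; adjust to your layout. With plain
# java -cp, hbase-common (and its transitive dependencies, here pulled in
# via a directory wildcard) must be listed by hand:
java -cp my-app.jar:/opt/hbase/lib/hbase-common.jar:/opt/hbase/lib/* \
  com.example.Main
```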

On Monday, July 4, 2016, Robert James <sr...@gmail.com> wrote:

> When trying to load HBase via Spark, I get NoClassDefFoundError
> org/apache/hadoop/hbase/HBaseConfiguration errors.
>
> How do I provide that class to Spark?
>