Posted to user@hbase.apache.org by Cheyenne Forbes <ch...@gmail.com> on 2017/05/21 21:08:16 UTC

Where should coprocessor dependencies go when using HDFS?

I added my coprocessor jar to "hdfs://9c0def24f75:9000/hbase/lib/", but I am
getting a NoClassDefFoundError for its dependencies, even though they were
also added to the same HDFS directory.

Regards,

Cheyenne O. Forbes

Re: Where should coprocessor dependencies go when using HDFS?

Posted by Jerry He <je...@gmail.com>.
As Ted said.

HBASE-14548 provides more options, but it is not in HBase 1.2.x.

Thanks.

Jerry
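
For context, on a 1.2.x cluster a table coprocessor is typically attached
from a single jar on HDFS via a table attribute. A sketch in the HBase
shell, with hypothetical table, jar, and class names:

```ruby
# HBase shell: attach a coprocessor from a jar on HDFS.
# 'my_table', the jar path, and the observer class are illustrative.
alter 'my_table', METHOD => 'table_att', 'coprocessor' => 'hdfs:///hbase/lib/my-coprocessor.jar|com.example.MyRegionObserver|1073741823|'
```

The attribute value is `jar-path|class|priority|args`; the path names one
jar, which is why the dependencies need to live inside that jar rather
than alongside it in the HDFS directory.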

On Sun, May 21, 2017 at 4:28 PM, Ted Yu <yu...@gmail.com> wrote:

> Looks like your code depends on
> https://mvnrepository.com/artifact/org.apache.lucene/lucene-queryparser
>
> Consider packaging dependencies in your coprocessor jar.
>
> Cheers

Re: Where should coprocessor dependencies go when using HDFS?

Posted by Ted Yu <yu...@gmail.com>.
Looks like your code depends on
https://mvnrepository.com/artifact/org.apache.lucene/lucene-queryparser

Consider packaging dependencies in your coprocessor jar.
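
One common way to do that with Maven is the Shade plugin, which folds the
runtime dependencies (here, lucene-queryparser) into a single jar at
package time. A minimal sketch; the plugin version is illustrative, and
the HBase/Hadoop artifacts should carry scope "provided" so they are not
bundled:

```xml
<!-- In pom.xml under <build><plugins>: bundle runtime deps into one jar.
     Version is illustrative; mark hbase-server etc. as <scope>provided</scope>. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

After `mvn package`, something like `jar tf target/my-coprocessor.jar | grep
ParseException` can confirm the Lucene classes made it into the jar before
copying it to HDFS.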

Cheers

On Sun, May 21, 2017 at 4:20 PM, Cheyenne Forbes <
cheyenne.osanu.forbes@gmail.com> wrote:

> 2017-05-21 15:23:58,865 FATAL [RS_OPEN_REGION-9c0def24f75b:16201-0] regionserver.HRegionServer: ABORTING region server 9c0def24f75b,16201,14953945952$
> java.lang.NoClassDefFoundError: org/apache/lucene/queryparser/classic/ParseException

Re: Where should coprocessor dependencies go when using HDFS?

Posted by Cheyenne Forbes <ch...@gmail.com>.
2017-05-21 15:23:58,865 FATAL [RS_OPEN_REGION-9c0def24f75b:16201-0] regionserver.HRegionServer: ABORTING region server 9c0def24f75b,16201,14953945952$
java.lang.NoClassDefFoundError: org/apache/lucene/queryparser/classic/ParseException
        at java.lang.Class.getDeclaredConstructors0(Native Method)
        at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
        at java.lang.Class.getConstructor0(Class.java:3075)
        at java.lang.Class.newInstance(Class.java:412)
        at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.loadInstance(CoprocessorHost.java:245)
        at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.load(CoprocessorHost.java:208)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.loadTableCoprocessors(RegionCoprocessorHost.java:364)
        at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.<init>(RegionCoprocessorHost.java:226)
        at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:722)
        at org.apache.hadoop.hbase.regionserver.HRegion.<init>(HRegion.java:630)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hbase.regionserver.HRegion.newHRegion(HRegion.java:6174)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6478)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6450)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6406)
        at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:6357)
        at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:362)
        at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:129)
        at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:129)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.lucene.queryparser.classic.ParseException
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)


Regards,

On Sun, May 21, 2017 at 4:54 PM, Ted Yu <yu...@gmail.com> wrote:

> Can you provide a bit more information:
>
> - the stack trace for the NoClassDefFoundError
> - which jars are placed under hdfs://9c0def24f75:9000/hbase/lib/
>
> I assume you use 1.2.5 (from your other post).
>
> Cheers

Re: Where should coprocessor dependencies go when using HDFS?

Posted by Ted Yu <yu...@gmail.com>.
Can you provide a bit more information:

- the stack trace for the NoClassDefFoundError
- which jars are placed under hdfs://9c0def24f75:9000/hbase/lib/

I assume you use 1.2.5 (from your other post).

Cheers
