Posted to common-user@hadoop.apache.org by Aji Janis <aj...@gmail.com> on 2013/04/04 21:21:11 UTC

Adding external jar

We are running (ancient) Hadoop version 0.20.203, and when running the
following command

/opt/hadoop/bin/hadoop jar my.jar full.path.to.my.class.example -libjars " /opt/../abc.jar, /opt/../xyz.jar"

it keeps throwing the error:
Exception in thread "main" java.lang.NoClassDefFoundError:

The class it's complaining about exists in the .jar I am including in
-libjars. Does anyone know (or tell me how I can find out) whether 0.20.203
supports -libjars? Any pointers on how I can debug this issue would be
great. Thank you.
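
For context, -libjars is one of Hadoop's generic options: it is handled by
GenericOptionsParser, which in the usual setup only runs when the job's main
class implements Tool and is launched through ToolRunner (or calls
GenericOptionsParser itself). A minimal sketch of such a driver follows; the
class name, job name and the elided mapper/reducer setup are illustrative,
not taken from this thread:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class ExampleDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // By the time run() is called, ToolRunner has already pushed the
        // command line through GenericOptionsParser, so -libjars and the
        // other generic options are reflected in getConf().
        Configuration conf = getConf();
        Job job = new Job(conf, "example-job");  // hypothetical job name
        job.setJarByClass(ExampleDriver.class);
        // ... set mapper/reducer classes and input/output paths here ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options before handing args to run().
        System.exit(ToolRunner.run(new Configuration(), new ExampleDriver(), args));
    }
}

Even then, -libjars mainly takes care of shipping the listed jars to the
tasks; the jar that the job's main class links against still has to be
visible to the client JVM that loads that class, which is what the
HADOOP_CLASSPATH advice in the replies below is about.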

Re: Adding external jar

Posted by Dhruv <dh...@gmail.com>.
Try adding the following line to your hadoop-env.sh file, restart your
cluster and run the job. Notice the : instead of , in this command:

export HADOOP_CLASSPATH=/opt/accumulo/lib/libthrift-0.6.1.jar:/opt/accumulo/lib/accumulo-core-1.4.2.jar:/opt/zookeeper/zookeeper-3.3.3.jar:/opt/accumulo/lib/cloudtrace-1.4.2.jar:/opt/accumulo/lib/commons-collections-3.2.jar:/opt/accumulo/lib/commons-configuration-1.5.jar:/opt/accumulo/lib/commons-io-1.4.jar:/opt/accumulo/lib/commons-jci-core-1.0.jar:/opt/accumulo/lib/commons-jci-fam-1.0.jar:/opt/accumulo/lib/commons-lang-2.4.jar:/opt/accumulo/lib/commons-logging-1.0.4.jar:/opt/accumulo/lib/commons-logging-api-1.0.4.jar



On Thu, Apr 4, 2013 at 1:17 PM, Aji Janis <aj...@gmail.com> wrote:

> I am not sure if my classpath is set up right. See:
>
> [hadoop@node]$ cat /opt/hadoop/conf/hadoop-env.sh | grep HADOOP_CLASSPATH
> export HADOOP_CLASSPATH=./:/conf:/build/*:
>
> and the specific command I was trying to run was actually a mapreduce job
> on accumulo, providing some accumulo libraries via -libjars.
>
> [hadoop@node]$ /opt/hadoop/bin/hadoop jar /opt/accumulo/lib/examples-simple-1.4.2.jar
> org.apache.accumulo.examples.simple.mapreduce.bulk.BulkIngestExample -libjars
> "/opt/accumulo/lib/libthrift-0.6.1.jar,/opt/accumulo/lib/accumulo-core-1.4.2.jar,/opt/zookeeper/zookeeper-3.3.3.jar,/opt/accumulo/lib/cloudtrace-1.4.2.jar,/opt/accumulo/lib/commons-collections-3.2.jar,/opt/accumulo/lib/commons-configuration-1.5.jar,/opt/accumulo/lib/commons-io-1.4.jar,/opt/accumulo/lib/commons-jci-core-1.0.jar,/opt/accumulo/lib/commons-jci-fam-1.0.jar,/opt/accumulo/lib/commons-lang-2.4.jar,/opt/accumulo/lib/commons-logging-1.0.4.jar,/opt/accumulo/lib/commons-logging-api-1.0.4.jar"
>
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/accumulo/core/client/Instance
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:264)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.accumulo.core.client.Instance
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>         ... 3 more
>
>
> But org/apache/accumulo/core/client/Instance exists in
> /opt/accumulo/lib/accumulo-core-1.4.2.jar, any clue what's going wrong here?
>
>
>
> On Thu, Apr 4, 2013 at 3:57 PM, Dhruv <dh...@gmail.com> wrote:
>
>> -libjars should work fine, but you need to make sure that your jars are in
>> the Hadoop client's classpath as well.
>>
>> Check HADOOP_CLASSPATH variable and see if it references your jars or not.
>>
>> If it doesn't, you can add "export HADOOP_CLASSPATH=/path/to/your/jar" in
>> your conf/hadoop-env.sh file.
>>
>>
>> On Thu, Apr 4, 2013 at 12:21 PM, Aji Janis <aj...@gmail.com> wrote:
>>
>>> We are running (ancient) hadoop version 0.20.203 and when running the
>>> following command
>>>
>>> /opt/hadoop/bin/hadoop jar my.jar full.path.to.my.class.example -libjars
>>> " /opt/../abc.jar, /opt/../xyz.jar"
>>>
>>> it keeps throwing the error:
>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>>
>>> The class it's complaining about exists in the .jar I am including in
>>> -libjars. Does anyone know (or tell me how I can find out) whether
>>> 0.20.203 supports -libjars? Any pointers on how I can debug this issue
>>> would be great. Thank you.
>>>
>>
>>
>

Re: Adding external jar

Posted by Aji Janis <aj...@gmail.com>.
I am not sure if my classpath is set up right. See:

[hadoop@node]$ cat /opt/hadoop/conf/hadoop-env.sh | grep HADOOP_CLASSPATH
export HADOOP_CLASSPATH=./:/conf:/build/*:

and the specific command I was trying to run was actually a mapreduce job
on accumulo, providing some accumulo libraries via -libjars.

[hadoop@node]$ /opt/hadoop/bin/hadoop jar /opt/accumulo/lib/examples-simple-1.4.2.jar
org.apache.accumulo.examples.simple.mapreduce.bulk.BulkIngestExample -libjars
"/opt/accumulo/lib/libthrift-0.6.1.jar,/opt/accumulo/lib/accumulo-core-1.4.2.jar,/opt/zookeeper/zookeeper-3.3.3.jar,/opt/accumulo/lib/cloudtrace-1.4.2.jar,/opt/accumulo/lib/commons-collections-3.2.jar,/opt/accumulo/lib/commons-configuration-1.5.jar,/opt/accumulo/lib/commons-io-1.4.jar,/opt/accumulo/lib/commons-jci-core-1.0.jar,/opt/accumulo/lib/commons-jci-fam-1.0.jar,/opt/accumulo/lib/commons-lang-2.4.jar,/opt/accumulo/lib/commons-logging-1.0.4.jar,/opt/accumulo/lib/commons-logging-api-1.0.4.jar"

Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/accumulo/core/client/Instance
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException:
org.apache.accumulo.core.client.Instance
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
        ... 3 more


But org/apache/accumulo/core/client/Instance exists in
/opt/accumulo/lib/accumulo-core-1.4.2.jar, any clue what's going wrong here?



On Thu, Apr 4, 2013 at 3:57 PM, Dhruv <dh...@gmail.com> wrote:

> -libjars should work fine, but you need to make sure that your jars are in
> the Hadoop client's classpath as well.
>
> Check HADOOP_CLASSPATH variable and see if it references your jars or not.
>
> If it doesn't, you can add "export HADOOP_CLASSPATH=/path/to/your/jar" in
> your conf/hadoop-env.sh file.
>
>
> On Thu, Apr 4, 2013 at 12:21 PM, Aji Janis <aj...@gmail.com> wrote:
>
>> We are running (ancient) hadoop version 0.20.203 and when running the
>> following command
>>
>> /opt/hadoop/bin/hadoop jar my.jar full.path.to.my.class.example -libjars
>> " /opt/../abc.jar, /opt/../xyz.jar"
>>
>> it keeps throwing the error:
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>
>> The class it's complaining about exists in the .jar I am including in
>> -libjars. Does anyone know (or tell me how I can find out) whether
>> 0.20.203 supports -libjars? Any pointers on how I can debug this issue
>> would be great. Thank you.
>>
>
>
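
Reading the stack trace above closely: the NoClassDefFoundError is raised
inside org.apache.hadoop.util.RunJar.main, that is, in the client JVM while
it is loading the job's main class, before the job's own main method (and
with it any -libjars handling) gets a chance to run. So the question is less
whether org.apache.accumulo.core.client.Instance is inside
accumulo-core-1.4.2.jar and more whether that jar is on the client's
classpath. As a quick sanity check that the class really is in the jar, a
small stand-alone snippet along these lines could be used (the jar path and
class name are taken from the thread; the CheckJar class itself is just an
illustration):

import java.util.jar.JarFile;

public class CheckJar {
    public static void main(String[] args) throws Exception {
        // Look for the compiled class file inside the jar named in the thread.
        JarFile jar = new JarFile("/opt/accumulo/lib/accumulo-core-1.4.2.jar");
        String entry = "org/apache/accumulo/core/client/Instance.class";
        System.out.println(entry + (jar.getEntry(entry) != null ? " found" : " NOT found"));
        jar.close();
    }
}

If the entry is found, what remains is purely a client classpath problem,
which is what the HADOOP_CLASSPATH suggestion earlier in the thread
addresses.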

Re: Adding external jar

Posted by Dhruv <dh...@gmail.com>.
-libjars should work fine, but you need to make sure that your jars are in
the Hadoop client's classpath as well.

Check the HADOOP_CLASSPATH variable and see whether it references your jars.

If it doesn't, you can add "export HADOOP_CLASSPATH=/path/to/your/jar" in
your conf/hadoop-env.sh file.


On Thu, Apr 4, 2013 at 12:21 PM, Aji Janis <aj...@gmail.com> wrote:

> We are running (ancient) hadoop version 0.20.203 and when running the
> following command
>
> /opt/hadoop/bin/hadoop jar my.jar full.path.to.my.class.example -libjars "
> /opt/../abc.jar, /opt/../xyz.jar"
>
> it keeps throwing the error:
> Exception in thread "main" java.lang.NoClassDefFoundError:
>
> The class it's complaining about exists in the .jar I am including in
> -libjars. Does anyone know (or tell me how I can find out) whether
> 0.20.203 supports -libjars? Any pointers on how I can debug this issue
> would be great. Thank you.
>
