Posted to common-user@hadoop.apache.org by Gang Luo <lg...@yahoo.com.cn> on 2010/04/19 16:24:05 UTC

import multiple jars

Hi all,
this is more of a Java question. I am using a third-party package in an example program: I compile against it with "-classpath" and pack my own classes into a jar. When I execute that jar, I also have to put the original package on the classpath, like "java -classpath package.jar:myExecutable.jar myClass"; otherwise it throws a ClassNotFoundException. However, when running a program under Hadoop, I cannot pass more than one jar file (bin/hadoop jar myExecutable.jar myClass). How do I bring in that package.jar? I tried "export CLASSPATH=...", but it doesn't help.

Thanks,
-Gang



Re: import multiple jars

Posted by Sonal Goyal <so...@gmail.com>.
Hi,

You can add your dependencies to a lib/ folder inside your main jar. Hadoop
will automatically ship them to the cluster with the job and put them on the task classpath.
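As a rough sketch (assuming your compiled classes are under classes/ and the dependency is the package.jar from your mail; adjust the names to your project), packaging could look like:

    mkdir -p build/lib
    cp package.jar build/lib/         # dependency jars go into lib/ inside the job jar
    cp -r classes/* build/            # your own compiled classes at the top level
    jar cvf myExecutable.jar -C build .
    bin/hadoop jar myExecutable.jar myClass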

You can also look at DistributedCache or the -libjars option.
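With -libjars, something like this should work, assuming your main class passes its arguments through ToolRunner/GenericOptionsParser (that is what actually picks up the option):

    bin/hadoop jar myExecutable.jar myClass -libjars package.jar <job args>

As far as I know, the bin/hadoop script builds its own classpath and ignores an exported CLASSPATH, which is why your export did not help; if you only need the extra jar on the client side, setting HADOOP_CLASSPATH instead should do it.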
Thanks and Regards,
Sonal
www.meghsoft.com

