Posted to common-user@hadoop.apache.org by Asif Jan <As...@unige.ch> on 2010/04/19 10:30:58 UTC
How to make the latest build work?
Hi
I need to build a Hadoop installation from the latest source code of
hadoop/common; I checked out the latest source and ran the ant target that
builds a distribution tar (ant tar).
When I try to run the system, I get an "HDFS not found" error.
Any idea how I can get a functional system from the latest sources?
thanks
Re: How to make the latest build work?
Posted by Steve Loughran <st...@apache.org>.
Asif Jan wrote:
> I need to build a Hadoop installation from the latest source code of
> hadoop/common; I checked out the latest source and ran the ant target
> that builds a distribution tar (ant tar). When I try to run the system,
> I get an "HDFS not found" error. [...]
I have put some notes up on how I'm building; it works even if you aren't
using git:
http://wiki.apache.org/hadoop/GitAndHadoop
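A minimal sketch of that flow (the git mirror URL is an assumption; the
wiki page above has the authoritative steps):

    # clone a read-only mirror of the source (URL is an assumption)
    git clone git://git.apache.org/hadoop-common.git
    cd hadoop-common
    # build the distribution tarball, as in the original question
    ant tar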
Re: How to make the latest build work?
Posted by chaitanya krishna <ch...@gmail.com>.
Hi Asif,
Instead of checking out the code of hadoop/common, use the code of
hadoop/mapreduce.
However, this will not have the necessary executables in the bin folder, so
you can copy the bin folder from the source code of hadoop/common.
Once copied, run "ant binary" to create a tarball which can be deployed on
your cluster.
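A rough sketch of those steps (the repository URLs are an assumption based
on the 2010 trunk layout; adjust to your setup):

    # check out mapreduce instead of common, plus common for its scripts
    svn checkout http://svn.apache.org/repos/asf/hadoop/mapreduce/trunk hadoop-mapreduce
    svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk hadoop-common
    # copy the launcher scripts from common into the mapreduce tree
    mkdir -p hadoop-mapreduce/bin
    cp hadoop-common/bin/* hadoop-mapreduce/bin/
    # build a deployable tarball
    cd hadoop-mapreduce
    ant binary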
Hope this helps.
-V.V.Chaitanya.
On Mon, Apr 19, 2010 at 2:00 PM, Asif Jan <As...@unige.ch> wrote:
> I need to build a Hadoop installation from the latest source code of
> hadoop/common [...] any idea how I can get a functional system from the
> latest sources?
Re: import multiple jars
Posted by Sonal Goyal <so...@gmail.com>.
Hi,
You can add your dependencies to the lib folder of your main jar; Hadoop
will automatically distribute them to the cluster.
You can also explore the DistributedCache or the -libjars option.
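For example (jar and class names are the hypothetical ones from the
question; note that -libjars requires the job's main class to parse generic
options, e.g. via ToolRunner):

    # option 1: bundle the dependency inside the job jar under lib/,
    # assuming package.jar has been placed in a local lib/ directory;
    # Hadoop adds jars under lib/ to the task classpath
    jar cf myExecutable.jar myClass.class lib/
    # option 2: ship the extra jar at submit time with -libjars
    bin/hadoop jar myExecutable.jar myClass -libjars package.jar <args>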
Thanks and Regards,
Sonal
www.meghsoft.com
On Mon, Apr 19, 2010 at 7:54 PM, Gang Luo <lg...@yahoo.com.cn> wrote:
> When I execute my jar file, I need to also put the original package on the
> classpath ("java -classpath package.jar:myExecutable.jar myClass");
> otherwise it reports a ClassNotFoundException. However, when running a
> program in Hadoop, I cannot pass more than one jar file (bin/hadoop jar
> myExecutable.jar myClass). How do I import that package.jar? [...]
import multiple jars
Posted by Gang Luo <lg...@yahoo.com.cn>.
Hi all,
This is kind of a Java problem. I was using a package: in an example
program, I imported the package via "-classpath" when compiling, and packed
my code into a jar. When I execute the jar, I also need to put the original
package on the classpath, like this: "java -classpath
package.jar:myExecutable.jar myClass"; otherwise it reports a
ClassNotFoundException. However, when running a program in Hadoop, I cannot
pass more than one jar file (bin/hadoop jar myExecutable.jar myClass). How
do I import that package.jar? I tried "export CLASSPATH=...", but it
doesn't help.
Thanks,
-Gang
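A rough sketch of the sequence Gang describes (file names are the
hypothetical ones from his message):

    # compile against the dependency, then pack the class into a jar
    javac -classpath package.jar myClass.java
    jar cf myExecutable.jar myClass.class
    # outside Hadoop, both jars go on the classpath
    java -classpath package.jar:myExecutable.jar myClass
    # "hadoop jar" takes a single jar, and the launcher script builds its
    # own classpath, so an exported CLASSPATH is not picked up
    bin/hadoop jar myExecutable.jar myClass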