Posted to general@hadoop.apache.org by Gary Yang <ga...@yahoo.com> on 2010/03/06 00:47:58 UTC

Compilation failed when compile hadoop common release-0.20.2

Hi,

I am trying to compile Hadoop Common from release 0.20.2. Below are the error messages and the Java and Ant versions I am using. Please tell me what I missed.

......

  [javadoc] Standard Doclet version 1.6.0_18
  [javadoc] Building tree for all the packages and classes...
  [javadoc] Building index for all the packages and classes...
  [javadoc] Building index for all classes...

java5.check:

BUILD FAILED
/hadoop_src/common/build.xml:908: 'java5.home' is not defined.  Forrest requires Java 5.  Please pass -Djava5.home=<base of Java 5 distribution> to Ant on the command-line.


echo $JAVA_HOME
/usr/java/latest

which java
/usr/java/latest/bin/java

/usr/java/latest/bin/java -version
java version "1.6.0_18"
Java(TM) SE Runtime Environment (build 1.6.0_18-b07)
Java HotSpot(TM) Server VM (build 16.0-b13, mixed mode)

ant -version
Apache Ant version 1.7.1 compiled on June 27 2008


Thanks,


Gary


Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Ted Yu <yu...@gmail.com>.
Did you first try 'ant jar' from under /hadoop_src ?

On Fri, Mar 5, 2010 at 3:47 PM, Gary Yang <ga...@yahoo.com> wrote:

> Hi,
>
> I try to compile hadoop common of the release 0.20.2. Below are the error
> messages and java and ant versions I am using. Please tell me what I missed.
>
> ......
>
>  [javadoc] Standard Doclet version 1.6.0_18
>  [javadoc] Building tree for all the packages and classes...
>  [javadoc] Building index for all the packages and classes...
>  [javadoc] Building index for all classes...
>
> java5.check:
>
> BUILD FAILED
> /hadoop_src/common/build.xml:908: 'java5.home' is not defined.  Forrest
> requires Java 5.  Please pass -Djava5.home=<base of Java 5 distribution> to
> Ant on the command-line.
>
>
> echo $JAVA_HOME
> /usr/java/latest
>
> which java
> /usr/java/latest/bin/java
>
> /usr/java/latest/bin/java -version
> java version "1.6.0_18"
> Java(TM) SE Runtime Environment (build 1.6.0_18-b07)
> Java HotSpot(TM) Server VM (build 16.0-b13, mixed mode)
>
> ant -version
> Apache Ant version 1.7.1 compiled on June 27 2008
>
>
> Thanks,
>
>
> Gary
>
>
>
>
>
>
>

Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Gary Yang <ga...@yahoo.com>.
Hi Steve,

Thank you very much for this email. I sent an email reporting a compilation error before I got yours. I am going to try your script and will let you know.

Thank you again,


Gary


--- On Mon, 3/8/10, Stephen Watt <sw...@us.ibm.com> wrote:

> From: Stephen Watt <sw...@us.ibm.com>
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> To: general@hadoop.apache.org
> Date: Monday, March 8, 2010, 12:24 PM
> Hi Gary
> 
> This is a script I put together based on the wiki page link Owen sent you;
> it will build most versions of Hadoop, including 0.20.2. You are welcome
> to use it. Notice how I have JAVA_HOME point to Java 6 and JAVA5 point to
> Java 5 (for Forrest). In this script it's pointing to IBM Java
> installations, but you can use a Sun/Oracle JDK as well. The wiki page
> should be able to answer other details.
> 
> #!/bin/sh
> export VERSION=0.20.2
> set PATH=$PATH:/home/hadoop/Java-Versions/ibm-java-i386-60/bin/
> export HADOOP_INSTALL=/home/hadoop/Hadoop-Versions/hadoop-$VERSION
> export FORREST_INSTALL=/home/hadoop/Test-Dependencies/apache-forrest-0.8
> export XERCES_INSTALL=/home/hadoop/Test-Dependencies/xerces-c_2_8_0
> export ANT_HOME=/home/hadoop/Test-Dependencies/apache-ant-1.7.1
> export JAVA_HOME=/home/hadoop/Java-Versions/ibm-java-i386-60
> export JAVA5=/home/hadoop/Java-Versions/ibm-java2-i386-50
> export CFLAGS=-m32
> export CXXFLAGS=-m32
> export PATH=$PATH:$ANT_HOME/bin
> 
> cd $HADOOP_INSTALL
> 
> # For some reason these scripts do not have execute permissions
> chmod 777 src/c++/utils/configure
> chmod 777 src/examples/pipes/configure
> chmod 777 src/native/configure
> 
> # Clean, Build and Run the Core (Non-Contrib) Unit Tests
> ant -Dversion=$VERSION -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=$XERCES_INSTALL -Dforrest.home=$FORREST_INSTALL -Djava5.home=$JAVA5 clean tar test-core > /home/hadoop/Test-Scripts/Hadoop-$VERSION/ibm32build.out
> 
> Kind regards
> Steve Watt
> 
> 
> 
> From:
> Gary Yang <ga...@yahoo.com>
> To:
> general@hadoop.apache.org
> Date:
> 03/08/2010 12:07 PM
> Subject:
> Re: Compilation failed when compile hadoop common release-0.20.2
> 
> 
> 
> Hi Owen,
> 
> Thanks for the reply. From the link you provided, I found
> the build 
> instruction. I do not understand the option, 
> "-Djava5.home=/usr/local/jdk1.5". Does it mean I have to
> use JDK 1.5? I 
> read somewhere it suggested to use JDK 1.6. 
> 
> Also, the very first line is "export JAVA_HOME=/path/to/32bit/jdk". Does it
> mean the JAVA_HOME jdk has to be 1.5? Please let me know.
> 
> 
> export JAVA_HOME=/path/to/32bit/jdk
> export CFLAGS=-m32
> export CXXFLAGS=-m32
> ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=/usr/local/xerces-c -Declipse.home=/usr/lib/eclipse -Dforrest.home=/usr/local/forrest -Djava5.home=/usr/local/jdk1.5 clean api-report tar test test-c++-libhdfs
> export JAVA_HOME=/path/to/64bit/jdk
> export CFLAGS=-m64
> export CXXFLAGS=-m64
> ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true compile-core-native compile-c++ tar
> 
> 
> Thanks,
> 
> 
> Gary
> 
> 
> 
> --- On Fri, 3/5/10, Owen O'Malley <om...@apache.org>
> wrote:
> 
> > From: Owen O'Malley <om...@apache.org>
> > Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> > To: general@hadoop.apache.org
> > Date: Friday, March 5, 2010, 4:32 PM
> > 
> > On Mar 5, 2010, at 3:47 PM, Gary Yang wrote:
> > 
> > > Hi,
> > > 
> > > I try to compile hadoop common of the release
> 0.20.2.
> > Below are the error messages and java and ant versions
> I am
> > using. Please tell me what I missed.
> > 
> > Forrest, which we use to generate the documentation,
> > requires java 5. Therefore, run:
> > 
> > ant -Djava5.home=/some/path/to/java5 tar
> > 
> > There are several more you need. For a more complete
> list,
> > I'd look at the how to release page:
> > 
> > http://wiki.apache.org/hadoop/HowToRelease
> > 
> > -- Owen
> > 
> 
> 
>  
> 
> 
> 


Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Gary Yang <ga...@yahoo.com>.
Hi Steve,

I used your script. However, I got errors related to xerces-c-3.1.0.

Below is how I compiled and installed xerces.

./configure --prefix=/path/tools/xerces-c-3.1.0
make
make install



Below are errors related to xerces-c-3.1.0. Do you have any idea?

jar-test:
      [jar] Building jar: /tmp/hadoop-common-0.20.2/build/hadoop-0.20.2-test.jar

ant-tasks:
     [copy] Copying 1 file to /tmp/hadoop-common-0.20.2/build/ant/org/apache/hadoop/ant
      [jar] Building jar: /tmp/hadoop-common-0.20.2/build/hadoop-0.20.2-ant.jar

compile-librecordio:
    [mkdir] Created dir: /tmp/hadoop-common-0.20.2/build/librecordio
     [exec] g++ -g3 -O0 -Wall -c -I/path/tools/xerces-c-3.1.0/include -o /tmp/hadoop-common-0.20.2/build/librecordio/recordio.o recordio.cc
     [exec] In file included from recordio.cc:22:
     [exec] xmlarchive.hh:77: error: conflicting return type specified for `virtual unsigned int hadoop::MyBinInputStream::curPos() const'
     [exec] /path/tools/xerces-c-3.1.0/include/xercesc/util/BinInputStream.hpp:41: error:   overriding `virtual XMLFilePos xercesc_3_1::BinInputStream::curPos() const'
     [exec] xmlarchive.hh: In member function `virtual xercesc_3_1::BinInputStream* hadoop::MyInputSource::makeStream() const':
     [exec] xmlarchive.hh:97: error: cannot allocate an object of type `hadoop::MyBinInputStream'
     [exec] xmlarchive.hh:97: error:   because the following virtual functions are abstract:
     [exec] /path/tools/xerces-c-3.1.0/include/xercesc/util/BinInputStream.hpp:67: error:  virtual const XMLCh* xercesc_3_1::BinInputStream::getContentType() const
     [exec] make: *** [/tmp/hadoop-common-0.20.2/build/librecordio/recordio.o] Error 1

BUILD FAILED
/tmp/hadoop-common-0.20.2/build.xml:1316: exec returned: 2
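
These conflicts are consistent with the Xerces-C 3.x API changes (BinInputStream::curPos now returns XMLFilePos, and getContentType is a new pure virtual method), which the librecordio sources in 0.20.2 predate. One possible workaround, sketched below, is to build against the older Xerces-C 2.8.0 that Steve's script uses; all paths, and the runConfigure options, are assumptions to adapt, not a tested recipe:

```shell
#!/bin/sh
# Hypothetical sketch: build Xerces-C 2.8.0 (the 2.x API librecordio
# was written against) instead of xerces-c-3.1.0, then point the
# Hadoop build at it. Paths and runConfigure options are assumptions.
export XERCESCROOT=/path/tools/xerces-c-src_2_8_0
cd $XERCESCROOT/src/xercesc
./runConfigure -plinux -cgcc -xg++ -minmem -nsocket -tnative -rpthread
make
# Then rerun the ant build with -Dxercescroot=$XERCESCROOT
# in place of the xerces-c-3.1.0 path.
```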



Thanks,


Gary



--- On Mon, 3/8/10, Stephen Watt <sw...@us.ibm.com> wrote:

> From: Stephen Watt <sw...@us.ibm.com>
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> To: general@hadoop.apache.org
> Date: Monday, March 8, 2010, 12:24 PM
> Hi Gary
> 
> This is a script I put together based on the wiki page link Owen sent you;
> it will build most versions of Hadoop, including 0.20.2. You are welcome
> to use it. Notice how I have JAVA_HOME point to Java 6 and JAVA5 point to
> Java 5 (for Forrest). In this script it's pointing to IBM Java
> installations, but you can use a Sun/Oracle JDK as well. The wiki page
> should be able to answer other details.
> 
> #!/bin/sh
> export VERSION=0.20.2
> set PATH=$PATH:/home/hadoop/Java-Versions/ibm-java-i386-60/bin/
> export HADOOP_INSTALL=/home/hadoop/Hadoop-Versions/hadoop-$VERSION
> export FORREST_INSTALL=/home/hadoop/Test-Dependencies/apache-forrest-0.8
> export XERCES_INSTALL=/home/hadoop/Test-Dependencies/xerces-c_2_8_0
> export ANT_HOME=/home/hadoop/Test-Dependencies/apache-ant-1.7.1
> export JAVA_HOME=/home/hadoop/Java-Versions/ibm-java-i386-60
> export JAVA5=/home/hadoop/Java-Versions/ibm-java2-i386-50
> export CFLAGS=-m32
> export CXXFLAGS=-m32
> export PATH=$PATH:$ANT_HOME/bin
> 
> cd $HADOOP_INSTALL
> 
> # For some reason these scripts do not have execute permissions
> chmod 777 src/c++/utils/configure
> chmod 777 src/examples/pipes/configure
> chmod 777 src/native/configure
> 
> # Clean, Build and Run the Core (Non-Contrib) Unit Tests
> ant -Dversion=$VERSION -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=$XERCES_INSTALL -Dforrest.home=$FORREST_INSTALL -Djava5.home=$JAVA5 clean tar test-core > /home/hadoop/Test-Scripts/Hadoop-$VERSION/ibm32build.out
> 
> Kind regards
> Steve Watt
> 
> 
> 
> From:
> Gary Yang <ga...@yahoo.com>
> To:
> general@hadoop.apache.org
> Date:
> 03/08/2010 12:07 PM
> Subject:
> Re: Compilation failed when compile hadoop common release-0.20.2
> 
> 
> 
> Hi Owen,
> 
> Thanks for the reply. From the link you provided, I found
> the build 
> instruction. I do not understand the option, 
> "-Djava5.home=/usr/local/jdk1.5". Does it mean I have to
> use JDK 1.5? I 
> read somewhere it suggested to use JDK 1.6. 
> 
> Also, the very first line is "export JAVA_HOME=/path/to/32bit/jdk". Does it
> mean the JAVA_HOME jdk has to be 1.5? Please let me know.
> 
> 
> export JAVA_HOME=/path/to/32bit/jdk
> export CFLAGS=-m32
> export CXXFLAGS=-m32
> ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=/usr/local/xerces-c -Declipse.home=/usr/lib/eclipse -Dforrest.home=/usr/local/forrest -Djava5.home=/usr/local/jdk1.5 clean api-report tar test test-c++-libhdfs
> export JAVA_HOME=/path/to/64bit/jdk
> export CFLAGS=-m64
> export CXXFLAGS=-m64
> ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true compile-core-native compile-c++ tar
> 
> 
> Thanks,
> 
> 
> Gary
> 
> 
> 
> --- On Fri, 3/5/10, Owen O'Malley <om...@apache.org>
> wrote:
> 
> > From: Owen O'Malley <om...@apache.org>
> > Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> > To: general@hadoop.apache.org
> > Date: Friday, March 5, 2010, 4:32 PM
> > 
> > On Mar 5, 2010, at 3:47 PM, Gary Yang wrote:
> > 
> > > Hi,
> > > 
> > > I try to compile hadoop common of the release
> 0.20.2.
> > Below are the error messages and java and ant versions
> I am
> > using. Please tell me what I missed.
> > 
> > Forrest, which we use to generate the documentation,
> > requires java 5. Therefore, run:
> > 
> > ant -Djava5.home=/some/path/to/java5 tar
> > 
> > There are several more you need. For a more complete
> list,
> > I'd look at the how to release page:
> > 
> > http://wiki.apache.org/hadoop/HowToRelease
> > 
> > -- Owen
> > 
> 
> 
>  
> 
> 
> 


Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Stephen Watt <sw...@us.ibm.com>.
Hi Gary

This is a script I put together based on the wiki page link Owen sent you;
it will build most versions of Hadoop, including 0.20.2. You are welcome to
use it. Notice how I have JAVA_HOME point to Java 6 and JAVA5 point to
Java 5 (for Forrest). In this script it's pointing to IBM Java
installations, but you can use a Sun/Oracle JDK as well. The wiki page
should be able to answer other details.

#!/bin/sh
export VERSION=0.20.2
export PATH=$PATH:/home/hadoop/Java-Versions/ibm-java-i386-60/bin/
export HADOOP_INSTALL=/home/hadoop/Hadoop-Versions/hadoop-$VERSION
export FORREST_INSTALL=/home/hadoop/Test-Dependencies/apache-forrest-0.8
export XERCES_INSTALL=/home/hadoop/Test-Dependencies/xerces-c_2_8_0
export ANT_HOME=/home/hadoop/Test-Dependencies/apache-ant-1.7.1
export JAVA_HOME=/home/hadoop/Java-Versions/ibm-java-i386-60
export JAVA5=/home/hadoop/Java-Versions/ibm-java2-i386-50
export CFLAGS=-m32
export CXXFLAGS=-m32
export PATH=$PATH:$ANT_HOME/bin

cd $HADOOP_INSTALL

# For some reason these scripts do not have execute permissions
chmod 777 src/c++/utils/configure
chmod 777 src/examples/pipes/configure
chmod 777 src/native/configure

# Clean, Build and Run the Core (Non-Contrib) Unit Tests
ant -Dversion=$VERSION -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=$XERCES_INSTALL -Dforrest.home=$FORREST_INSTALL -Djava5.home=$JAVA5 clean tar test-core > /home/hadoop/Test-Scripts/Hadoop-$VERSION/ibm32build.out

Kind regards
Steve Watt



From:
Gary Yang <ga...@yahoo.com>
To:
general@hadoop.apache.org
Date:
03/08/2010 12:07 PM
Subject:
Re: Compilation failed when compile hadoop common release-0.20.2



Hi Owen,

Thanks for the reply. From the link you provided, I found the build
instructions. I do not understand the option
"-Djava5.home=/usr/local/jdk1.5". Does it mean I have to use JDK 1.5? I
read somewhere that JDK 1.6 is suggested.

Also, the very first line is "export JAVA_HOME=/path/to/32bit/jdk". Does it
mean the JAVA_HOME jdk has to be 1.5? Please let me know.


export JAVA_HOME=/path/to/32bit/jdk
export CFLAGS=-m32
export CXXFLAGS=-m32
ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=/usr/local/xerces-c -Declipse.home=/usr/lib/eclipse -Dforrest.home=/usr/local/forrest -Djava5.home=/usr/local/jdk1.5 clean api-report tar test test-c++-libhdfs
export JAVA_HOME=/path/to/64bit/jdk
export CFLAGS=-m64
export CXXFLAGS=-m64
ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true compile-core-native compile-c++ tar


Thanks,


Gary



--- On Fri, 3/5/10, Owen O'Malley <om...@apache.org> wrote:

> From: Owen O'Malley <om...@apache.org>
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> To: general@hadoop.apache.org
> Date: Friday, March 5, 2010, 4:32 PM
> 
> On Mar 5, 2010, at 3:47 PM, Gary Yang wrote:
> 
> > Hi,
> > 
> > I try to compile hadoop common of the release 0.20.2.
> Below are the error messages and java and ant versions I am
> using. Please tell me what I missed.
> 
> Forrest, which we use to generate the documentation,
> requires java 5. Therefore, run:
> 
> ant -Djava5.home=/some/path/to/java5 tar
> 
> There are several more you need. For a more complete list,
> I'd look at the how to release page:
> 
> http://wiki.apache.org/hadoop/HowToRelease
> 
> -- Owen
> 





Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Gary Yang <ga...@yahoo.com>.
Hi Allen,

I did not find a plain Java 1.5 download. However, I found the JDK 5.01 EE edition, which I think is Java 1.5. I downloaded and installed it, and got another compilation error. See below. Any idea?

java_ee_sdk-5_01-linux.bin  (the java I downloaded)

/tmp/jdk1.5/jdk/bin/java -version
java version "1.5.0_09"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_09-b03)


ls /usr/forrest-0.8
bin/  build/  etc/  index.html  KEYS  lib/  LICENSE.txt  main/  NOTICE.txt  plugins/  README.txt  site-author/  tools/  whiteboard/

ls /tmp/eclipse
about_files/  artifacts.xml   dropins/  eclipse.ini   features/  libcairo-swt.so*  p2/       readme/
about.html    configuration/  eclipse*  epl-v10.html  icon.xpm   notice.html       plugins/



My build script:

#!/bin/sh

export JAVA_HOME=/tmp/jdk1.5/jdk
export CFLAGS=-m32
export CXXFLAGS=-m32
ant -Dversion=0.20.2 -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=/usr/xercesc -Declipse.home=/tmp/eclipse -Dforrest.home=/usr/forrest-0.8 -Djava5.home=/tmp/jdk1.5/jdk clean api-report tar test test-c++-libhdfs



Error Messages:

    [touch] Creating /tmp/null568498220
   [delete] Deleting: /tmp/null568498220
     [copy] Copying 7 files to /tmp/hadoop-common-0.20.2/build/webapps

record-parser:

compile-rcc-compiler:
    [javac] Compiling 29 source files to /tmp/hadoop-common-0.20.2/build/classes
    [javac] javac: invalid target release: 1.6
    [javac] Usage: javac <options> <source files>
    [javac] where possible options include:
    [javac]   -g                         Generate all debugging info
    [javac]   -g:none                    Generate no debugging info
    [javac]   -g:{lines,vars,source}     Generate only some debugging info
    [javac]   -nowarn                    Generate no warnings
    [javac]   -verbose                   Output messages about what the compiler is doing
    [javac]   -deprecation               Output source locations where deprecated APIs are used
    [javac]   -classpath <path>          Specify where to find user class files
    [javac]   -cp <path>                 Specify where to find user class files
    [javac]   -sourcepath <path>         Specify where to find input source files
    [javac]   -bootclasspath <path>      Override location of bootstrap class files
    [javac]   -extdirs <dirs>            Override location of installed extensions
    [javac]   -endorseddirs <dirs>       Override location of endorsed standards path
    [javac]   -d <directory>             Specify where to place generated class files
    [javac]   -encoding <encoding>       Specify character encoding used by source files
    [javac]   -source <release>          Provide source compatibility with specified release
    [javac]   -target <release>          Generate class files for specific VM version
    [javac]   -version                   Version information
    [javac]   -help                      Print a synopsis of standard options
    [javac]   -X                         Print a synopsis of nonstandard options
    [javac]   -J<flag>                   Pass <flag> directly to the runtime system
    [javac] 

BUILD FAILED
/tmp/hadoop-common-0.20.2/build.xml:316: Compile failed; see the compiler error output for details.


I deleted my Java 1.6, so I do not understand why I got the error below.

    [javac] javac: invalid target release: 1.6
    [javac] Usage: javac <options> <source files>


Thanks,


Gary


--- On Mon, 3/8/10, Allen Wittenauer <aw...@linkedin.com> wrote:

> From: Allen Wittenauer <aw...@linkedin.com>
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> To: general@hadoop.apache.org
> Date: Monday, March 8, 2010, 10:30 AM
> 
> 
> 
> On 3/8/10 10:06 AM, "Gary Yang" <ga...@yahoo.com>
> wrote:
> 
> > Hi Owen,
> > 
> > Thanks for the reply. From the link you provided, I
> found the build
> > instruction. I do not understand the option,
> "-Djava5.home=/usr/local/jdk1.5".
> > Does it mean I have to use JDK 1.5? I read somewhere
> it suggested to use JDK
> > 1.6. 
> 
> JDK 1.5 is required to build the documentation. 
> JDK1.6 is used everywhere
> else. So it uses the java5.home setting to kick off forrest
> without
> upsetting the JAVA_HOME env var.
> 
> 
> 


Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Allen Wittenauer <aw...@linkedin.com>.


On 3/8/10 10:06 AM, "Gary Yang" <ga...@yahoo.com> wrote:

> Hi Owen,
> 
> Thanks for the reply. From the link you provided, I found the build
> instruction. I do not understand the option, "-Djava5.home=/usr/local/jdk1.5".
> Does it mean I have to use JDK 1.5? I read somewhere it suggested to use JDK
> 1.6. 

JDK 1.5 is required to build the documentation; JDK 1.6 is used everywhere
else. The build uses the java5.home setting to kick off Forrest without
upsetting the JAVA_HOME env var.
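
In script form, that split might look like the sketch below; the paths are placeholders, and the ant targets are trimmed to the essentials:

```shell
#!/bin/sh
# Hypothetical layout: JAVA_HOME -> a 1.6 JDK (compiles Hadoop),
# JAVA5 -> a 1.5 JDK (used only by Forrest to build the docs).
export JAVA_HOME=/opt/jdk1.6
export JAVA5=/opt/jdk1.5
export CFLAGS=-m32
export CXXFLAGS=-m32
# Only -Djava5.home references the 1.5 JDK; javac still comes from JAVA_HOME.
ant -Dversion=0.20.2 -Dforrest.home=/usr/local/forrest -Djava5.home=$JAVA5 clean tar
```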



Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Gary Yang <ga...@yahoo.com>.
Hi Owen,

Thanks for the reply. From the link you provided, I found the build instructions. I do not understand the option "-Djava5.home=/usr/local/jdk1.5". Does it mean I have to use JDK 1.5? I read somewhere that JDK 1.6 is suggested.

Also, the very first line is "export JAVA_HOME=/path/to/32bit/jdk". Does it mean the JAVA_HOME jdk has to be 1.5? Please let me know.


export JAVA_HOME=/path/to/32bit/jdk
export CFLAGS=-m32
export CXXFLAGS=-m32
ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true -Dlibhdfs=1 -Dlibrecordio=true -Dxercescroot=/usr/local/xerces-c -Declipse.home=/usr/lib/eclipse -Dforrest.home=/usr/local/forrest -Djava5.home=/usr/local/jdk1.5 clean api-report tar test test-c++-libhdfs
export JAVA_HOME=/path/to/64bit/jdk
export CFLAGS=-m64
export CXXFLAGS=-m64
ant -Dversion=X.Y.Z -Dcompile.native=true -Dcompile.c++=true compile-core-native compile-c++ tar


Thanks,


Gary



--- On Fri, 3/5/10, Owen O'Malley <om...@apache.org> wrote:

> From: Owen O'Malley <om...@apache.org>
> Subject: Re: Compilation failed when compile hadoop common release-0.20.2
> To: general@hadoop.apache.org
> Date: Friday, March 5, 2010, 4:32 PM
> 
> On Mar 5, 2010, at 3:47 PM, Gary Yang wrote:
> 
> > Hi,
> > 
> > I try to compile hadoop common of the release 0.20.2.
> Below are the error messages and java and ant versions I am
> using. Please tell me what I missed.
> 
> Forrest, which we use to generate the documentation,
> requires java 5. Therefore, run:
> 
> ant -Djava5.home=/some/path/to/java5 tar
> 
> There are several more you need. For a more complete list,
> I'd look at the how to release page:
> 
> http://wiki.apache.org/hadoop/HowToRelease
> 
> -- Owen
> 


Re: Compilation failed when compile hadoop common release-0.20.2

Posted by Owen O'Malley <om...@apache.org>.
On Mar 5, 2010, at 3:47 PM, Gary Yang wrote:

> Hi,
>
> I try to compile hadoop common of the release 0.20.2. Below are the  
> error messages and java and ant versions I am using. Please tell me  
> what I missed.

Forrest, which we use to generate the documentation, requires Java 5.
Therefore, run:

ant -Djava5.home=/some/path/to/java5 tar

There are several more you need. For a more complete list, I'd look at  
the how to release page:

http://wiki.apache.org/hadoop/HowToRelease

-- Owen