Posted to user@hbase.apache.org by Mike Spreitzer <ms...@us.ibm.com> on 2011/02/14 04:48:55 UTC

Building Hadoop for HBase

Yes, I think I could use some clues.  The HBase instructions send me to (http://wiki.apache.org/hadoop/HowToRelease), and I looked at (http://wiki.apache.org/hadoop/HowToRelease#Building), which is daunting and confusing (I thought I was supposed to build with Java 6, not 5).  I went looking for instructions on how to build for my own purposes rather than for making a release, and found (http://wiki.apache.org/hadoop/HowToContribute).  I fetched branch-0.20-append from SVN and then did a build with this command (copied from HowToContribute):
ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean test tar
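
For the record, the whole sequence was roughly the following; the repository URL is written from memory, so double-check it against where branch-0.20-append actually lives:

svn checkout http://svn.apache.org/repos/asf/hadoop/common/branches/branch-0.20-append/ hadoop-common
cd hadoop-common
ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean test tar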

The actual build part produces no complaints, but some of the tests have problems over the course of about three hours.  The typescript ends like this:
BUILD FAILED
/root/apachedev/hadoop-common/build.xml:817: The following error occurred while executing this line:
/root/apachedev/hadoop-common/build.xml:806: The following error occurred while executing this line:
/root/apachedev/hadoop-common/src/contrib/build.xml:48: The following error occurred while executing this line:
/root/apachedev/hadoop-common/src/contrib/streaming/build.xml:40: The following error occurred while executing this line:
/root/apachedev/hadoop-common/src/contrib/build-contrib.xml:245: Tests failed!

Total time: 184 minutes 11 seconds

The following searches through my typescript show the failures and errors.

# egrep -C 1 "Failures: [^0]" typescript.txt
    [junit] Running org.apache.hadoop.cli.TestCLI
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 32.243 sec
    [junit] Test org.apache.hadoop.cli.TestCLI FAILED
--
    [junit] Running org.apache.hadoop.fs.TestLocalDirAllocator
    [junit] Tests run: 5, Failures: 3, Errors: 0, Time elapsed: 0.493 sec
    [junit] Test org.apache.hadoop.fs.TestLocalDirAllocator FAILED
--
    [junit] Running org.apache.hadoop.mapred.lib.TestCombineFileInputFormat
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 5.061 sec
    [junit] Test org.apache.hadoop.mapred.lib.TestCombineFileInputFormat FAILED

# egrep -C 1 "Errors: [^0]" typescript.txt
    [junit] Running org.apache.hadoop.hdfs.TestDistributedFileSystem
    [junit] Tests run: 4, Failures: 0, Errors: 1, Time elapsed: 9.057 sec
    [junit] Test org.apache.hadoop.hdfs.TestDistributedFileSystem FAILED
--
    [junit] Running org.apache.hadoop.hdfs.TestFileAppend4
    [junit] Tests run: 18, Failures: 0, Errors: 1, Time elapsed: 315.85 sec
    [junit] Test org.apache.hadoop.hdfs.TestFileAppend4 FAILED
--
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.datanode.TestDiskError FAILED (timeout)
--
    [junit] Running org.apache.hadoop.streaming.TestStreamingBadRecords
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.streaming.TestStreamingBadRecords FAILED (timeout)

Also, I am puzzled that HowToContribute does not mention building the native library while HowToRelease does.  (http://hadoop.apache.org/common/docs/current/native_libraries.html) says "Hadoop has native implementations of certain components for performance reasons and for non-availability of Java implementations", which suggests to me that the native library is not optional (at least for MapReduce, which is the client mentioned on that page).  What is going on here?
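
In case it clarifies what I am asking about: my reading of that native_libraries page is that the native code is built by passing the compile.native flag to the regular Ant build, something like the line below.  I have not tried this on branch-0.20-append, so treat it as my interpretation of the page rather than a verified recipe.

ant -Dcompile.native=true clean tar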

Thanks,
Mike Spreitzer

Re: Building Hadoop for HBase

Posted by Ryan Rawson <ry...@gmail.com>.
I usually don't bother running the Hadoop test suite; it's a rare day
when a complex project has a stable, bulletproof test suite where every
error means a real problem rather than a problem with the test
environment.  For example, all the Hadoop tests might not pass on
anything but Linux.  Having worked on many huge projects, I can say this
is a major problem that has never really been adequately addressed
except by very strict control over the test environment (to and beyond
the point of what most people do to their "prod" environment).
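
If you do want to know whether a particular failure is real or just your environment, re-running that one test by itself is usually enough to tell.  If I remember the 0.20 Ant build correctly it honors a testcase property (double-check the target and property names against the branch's build.xml), e.g. for one of the failures you listed:

ant -Dtestcase=TestLocalDirAllocator test-core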

So, just ant ... tar, and skip the tests :-)
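
(Concretely, that is just your original command with the test target dropped, i.e. something like

ant -Djavac.args="-Xlint -Xmaxwarns 1000" clean tar

which builds the tarball without spending three hours in JUnit.)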

-ryan
