Posted to user@hadoop.apache.org by Martin Stenhoff <su...@hotmail.com> on 2013/10/01 10:02:57 UTC

Building Hadoop 2.1.0 beta on Windows

I've just downloaded the Hadoop 2.1.0 beta source and I'm not having much luck building it (well, heaps of tests fail).
 
According to BUILDING.txt these are the pre-reqs:
* Windows System
* JDK 1.6
* Maven 3.0
* Windows SDK or Visual Studio 2010 Professional
* ProtocolBuffer 2.4.1+ (for MapReduce and HDFS)
* Findbugs 1.3.9 (if running findbugs)
* Unix command-line tools from GnuWin32 or Cygwin: sh, mkdir, rm, cp, tar, gzip
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
 
First, ProtocolBuffers needs to be 2.5.0 (the build fails with an error saying it expected version 2.5.0 but got 2.4.1).
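A quick sanity check that the build is picking up the right protoc (assuming protoc.exe is on PATH, as set below):

where protoc
protoc --version

where lists every protoc.exe on PATH in resolution order, and --version should print "libprotoc 2.5.0".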
I'm using JDK 1.7.0_40 x64 since one needs to register with Oracle to get JDK 1.6 these days. Is JDK 1.6 a must?
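JDK 1.7 may well be fine, but to rule out a toolchain mismatch it's worth confirming which JDK Maven actually runs under:

mvn -version
"%JAVA_HOME%\bin\java" -version

mvn -version prints the Java version and Java home Maven resolved; both commands should point at the same 1.7 x64 install.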
I can't find sh.exe in GnuWin32 (http://gnuwin32.sourceforge.net/), so I'm using sh.exe from UnxUtils (http://unxutils.sourceforge.net/).
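Since more than one toolkit can drop an sh.exe on PATH, it's worth checking which one wins:

where sh

where lists every sh.exe on PATH in resolution order; the UnxUtils copy should come first, ahead of anything from GnuWin32 or Cygwin.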
 
I'm running the following from a VS 2010 Ultimate Administrator cmd.exe window:
set path=C:\Software\Tools\UnxUtils\bin;C:\Software\Tools\GnuWin32Tools\bin;C:\Software\Dev\protoc-2.5.0-win32;%path%
set Platform=x64
set JAVA_HOME=C:\Software\Dev\Java\jdk1.7_x64
C:\Software\Dev\Java\build\apache-maven-3.0.5\bin\mvn package -Pdist -Pdocs -Psrc -Dtar
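When chasing failures like the ones below, it can help to separate packaging from testing. Two variations using standard Maven/Surefire options (nothing Hadoop-specific):

rem build the distribution without running any tests
C:\Software\Dev\Java\build\apache-maven-3.0.5\bin\mvn package -Pdist -Pdocs -Psrc -Dtar -DskipTests

rem then re-run just the hadoop-common tests while iterating
C:\Software\Dev\Java\build\apache-maven-3.0.5\bin\mvn test -pl hadoop-common-project/hadoop-common

-pl selects a single module by its path relative to the source root; the path above matches the module the failures below come from.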
 
These are the failures/errors (for hadoop-common):
 
testWithStringAndConfForBuggyPath(org.apache.hadoop.fs.shell.TestPathData): checking exist
testCreateLinkToDotDot(org.apache.hadoop.fs.TestLocalFSFileContextSymlink): expected:<E:[\hdc\hadoop-common-project\hadoop-common\]target/test/data/6ju...> but was:<E:[/hdc/hadoop-common-project/hadoop-common/]target/test/data/6ju...>
testFstat(org.apache.hadoop.io.nativeio.TestNativeIO): Stat mode field should indicate a regular file expected:<32768> but was:<61440>
testInitFirstVerifyStopInvokedImmediately(org.apache.hadoop.metrics2.impl.TestMetricsSystemImpl): (..)
 
Timeouts:
testPutWithoutP(org.apache.hadoop.fs.shell.TestCopyPreserveFlag): test timed out after 10000 milliseconds
testCpWithoutP(org.apache.hadoop.fs.shell.TestCopyPreserveFlag): test timed out after 10000 milliseconds
testGetWithoutP(org.apache.hadoop.fs.shell.TestCopyPreserveFlag): test timed out after 10000 milliseconds
testPutWithP(org.apache.hadoop.fs.shell.TestCopyPreserveFlag): test timed out after 10000 milliseconds
testGetWithP(org.apache.hadoop.fs.shell.TestCopyPreserveFlag): test timed out after 10000 milliseconds
testCpWithP(org.apache.hadoop.fs.shell.TestCopyPreserveFlag): test timed out after 10000 milliseconds
testReadIncorrectlyRestrictedWithSecurity(org.apache.hadoop.io.TestSecureIOUtils): test timed out after 10000 milliseconds
testProtoBufRpc2(org.apache.hadoop.ipc.TestProtoBufRpc): test timed out after 5000 milliseconds
 
Unable to create symlink: Rejecting forward-slash separated path which would result in an unusable symlink:
There are a ton of errors about invalid symlinks (mostly coming from org.apache.hadoop.fs.TestLocalFSFileContextSymlink), e.g.:
Unable to create symlink: Rejecting forward-slash separated path which would result in an unusable symlink: link =  E:/hdc/hadoop-common-project/hadoop-co...
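One thing worth ruling out (just a guess): whether the account can create symlinks at all, since Windows requires an elevated prompt for mklink by default. A minimal check with throwaway file names:

echo test > target.txt
mklink link.txt target.txt
del link.txt target.txt

If mklink fails with "You do not have sufficient privilege to perform this operation", the symlink failures are an elevation problem; if it succeeds, the E:\ vs E:/ separator mismatch in the assertions above looks like the more likely culprit.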
 
Could not initialize class sun.security.provider.SecureRandom$SeederHolder:
testExtraLongRpc(org.apache.hadoop.ipc.TestProtoBufRpc): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testProtoBufRpc(org.apache.hadoop.ipc.TestProtoBufRpc): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testProtoBufRandomException(org.apache.hadoop.ipc.TestProtoBufRpc): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testRealUserGroupAuthorizationFailure(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testRealUserIPAuthorizationFailure(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testProxyWithToken(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testRealUserSetup(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testRealUserIPNotSpecified(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testRealUserGroupNotSpecified(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
testTokenBySuperUser(org.apache.hadoop.security.TestDoAsEffectiveUser): Could not initialize class sun.security.provider.SecureRandom$SeederHolder
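"Could not initialize class X" is a NoClassDefFoundError that the JVM throws after the class's static initializer has already failed once; the real stack trace is in the first ExceptionInInitializerError, usually from an earlier test in the same JVM. Re-running one affected test on its own should surface it (standard Surefire option):

C:\Software\Dev\Java\build\apache-maven-3.0.5\bin\mvn test -pl hadoop-common-project/hadoop-common -Dtest=TestProtoBufRpc

The surefire report for that run should then show the underlying exception instead of the masked "Could not initialize class" message.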
 
 
Does anyone know what causes these errors?
 
Thanks
 
Martin