Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/08/05 13:37:19 UTC

Hadoop-Hdfs-trunk - Build # 739 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk/739/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1123 lines...]

forrest.check:

docs:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:867: Execute failed: java.io.IOException: Cannot run program "/home/hudson/tools/forrest/latest/bin/forrest" (in directory "/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/docs"): java.io.IOException: error=2, No such file or directory
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
	at java.lang.Runtime.exec(Runtime.java:593)
	at org.apache.tools.ant.taskdefs.Execute$Java13CommandLauncher.exec(Execute.java:827)
	at org.apache.tools.ant.taskdefs.Execute.launch(Execute.java:445)
	at org.apache.tools.ant.taskdefs.Execute.execute(Execute.java:459)
	at org.apache.tools.ant.taskdefs.ExecTask.runExecute(ExecTask.java:635)
	at org.apache.tools.ant.taskdefs.ExecTask.runExec(ExecTask.java:676)
	at org.apache.tools.ant.taskdefs.ExecTask.execute(ExecTask.java:502)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:291)
	at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:390)
	at org.apache.tools.ant.Target.performTasks(Target.java:411)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1397)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1366)
	at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1249)
	at org.apache.tools.ant.Main.runBuild(Main.java:801)
	at org.apache.tools.ant.Main.startAnt(Main.java:218)
	at org.apache.tools.ant.launch.Launcher.run(Launcher.java:280)
	at org.apache.tools.ant.launch.Launcher.main(Launcher.java:109)
Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
	at java.lang.UNIXProcess.<init>(UNIXProcess.java:148)
	at java.lang.ProcessImpl.start(ProcessImpl.java:65)
	at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
	... 23 more

Total time: 25 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
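[Editor's note] Build 739 above dies before any tests run: ant tries to execute /home/hudson/tools/forrest/latest/bin/forrest, which no longer exists, and the only symptom is an opaque `java.io.IOException: error=2`. The `/home/hudson` tool path alongside a `/home/jenkins` workspace suggests a tool location that went stale when the slaves moved, though that is an inference from the paths, not something the log states. A minimal, hypothetical pre-flight guard for the job's shell step could surface this failure class directly (the `$FORREST_HOME/bin/forrest` call shown in the comment mirrors the `-Dforrest.home=$FORREST_HOME` flag visible in the ANALYSIS command later in this digest):

```shell
#!/bin/sh
# Hypothetical pre-flight check, not part of the actual job: verify a required
# tool exists and is executable before launching ant, so the console says
# which tool is missing instead of a bare "error=2, No such file or directory".
check_tool() {
    if [ ! -x "$1" ]; then
        echo "ERROR: required tool not found or not executable: $1" >&2
        return 1
    fi
    echo "found: $1"
}

# In the Jenkins shell step one would guard the doc build with, e.g.:
#   check_tool "$FORREST_HOME/bin/forrest" || exit 1
check_tool /bin/sh
```

Run at the top of the build step, this turns a mid-build stack trace into a one-line diagnosis at second zero.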

Hadoop-Hdfs-trunk - Build # 760 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/760/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8950 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.468 sec
Running org.apache.hadoop.hdfs.TestFSInputChecker
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.281 sec
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.635 sec
Running org.apache.hadoop.hdfs.TestModTime
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.986 sec
Running org.apache.hadoop.hdfs.TestBlockMissingException
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.21 sec
Running org.apache.hadoop.hdfs.TestReplication
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.209 sec
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.514 sec
Running org.apache.hadoop.hdfs.protocol.TestCorruptFileBlocks
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.07 sec
Running org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.559 sec
Running org.apache.hadoop.hdfs.server.datanode.TestRefreshNamenodes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.019 sec
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.047 sec
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.875 sec

Results :

Failed tests: 

Tests in error: 

Tests run: 833, Failures: 4, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:11:32.340s
[INFO] Finished at: Mon Aug 22 18:53:59 UTC 2011
[INFO] Final Memory: 9M/117M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.6:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 759 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/759/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
Started by timer
Building remotely on hadoop6
Location 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' does not exist
Cleaning workspace /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk
Checking out http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs
ERROR: Failed to check out http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs
org.tmatesoft.svn.core.SVNException: svn: URL 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' doesn't exist
	at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:64)
	at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:51)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.doCheckout(SVNUpdateClient.java:910)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:83)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1994)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:287)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: org.tmatesoft.svn.core.SVNErrorMessage: svn: URL 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' doesn't exist
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:163)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:118)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.doCheckout(SVNUpdateClient.java:909)
	... 15 more
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
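[Editor's note] Builds 759 and 758 never reach compilation: the checkout URL http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs no longer exists, presumably because the HDFS tree was relocated during the trunk reorganization (an inference; the log only reports the missing URL). Every later step, including "No tests ran", is then an expected consequence rather than a separate problem. When scanning digests like this one, a small hypothetical helper can separate SCM failures from genuine build or test failures:

```shell
#!/bin/sh
# Hypothetical triage helper, not part of the job: an SCM failure means no
# code was checked out, built, or tested, so downstream results are moot.
is_scm_failure() {
    grep -q "ERROR: Failed to check out" "$1"
}

# Demo against a captured console fragment matching the output above.
cat > /tmp/console_tail.txt <<'EOF'
ERROR: Failed to check out http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs
org.tmatesoft.svn.core.SVNException: svn: URL 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' doesn't exist
EOF

is_scm_failure /tmp/console_tail.txt && echo "SCM failure: fix the job's repository URL"
```

The fix itself is configuration, not code: point the job at the repository's new location.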

Hadoop-Hdfs-trunk - Build # 758 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/758/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
Started by timer
Building remotely on hadoop2
Location 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' does not exist
Cleaning workspace /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk
Checking out http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs
ERROR: Failed to check out http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs
org.tmatesoft.svn.core.SVNException: svn: URL 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' doesn't exist
	at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:64)
	at org.tmatesoft.svn.core.internal.wc.SVNErrorManager.error(SVNErrorManager.java:51)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.doCheckout(SVNUpdateClient.java:910)
	at hudson.scm.subversion.CheckoutUpdater$1.perform(CheckoutUpdater.java:83)
	at hudson.scm.subversion.WorkspaceUpdater$UpdateTask.delegateTo(WorkspaceUpdater.java:135)
	at hudson.scm.SubversionSCM$CheckOutTask.perform(SubversionSCM.java:726)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:707)
	at hudson.scm.SubversionSCM$CheckOutTask.invoke(SubversionSCM.java:691)
	at hudson.FilePath$FileCallableWrapper.call(FilePath.java:1994)
	at hudson.remoting.UserRequest.perform(UserRequest.java:118)
	at hudson.remoting.UserRequest.perform(UserRequest.java:48)
	at hudson.remoting.Request$2.run(Request.java:270)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
	at java.lang.Thread.run(Thread.java:662)
Caused by: org.tmatesoft.svn.core.SVNErrorMessage: svn: URL 'http://svn.apache.org/repos/asf/hadoop/common/trunk/hdfs' doesn't exist
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:163)
	at org.tmatesoft.svn.core.SVNErrorMessage.create(SVNErrorMessage.java:118)
	at org.tmatesoft.svn.core.wc.SVNUpdateClient.doCheckout(SVNUpdateClient.java:909)
	... 15 more
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 757 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/757/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1250 lines...]

clean-fi:

clean-sign:

clean:
   [delete] Deleting directory /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build

BUILD SUCCESSFUL
Total time: 0 seconds


======================================================================
======================================================================
ANALYSIS: ant -Drun.clover=true clover checkstyle test generate-clover-reports -Dtest.junit.output.format=xml -Dtest.output=yes -Dcompile.c++=true -Dcompile.native=true -Dfindbugs.home=$FINDBUGS_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================


Buildfile: build.xml

clover.setup:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/clover/db
[clover-setup] Clover Version 3.1.0, built on May 31 2011 (build-821)
[clover-setup] Loaded from: /home/jenkins/tools/clover/latest/lib/clover.jar

BUILD FAILED
java.lang.RuntimeException: Clover upgrades for your license ended December 14 2010, and this version of Clover was built May 31 2011. Please visit http://www.atlassian.com/clover/renew for information on upgrading your license.
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:103)
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:25)
	at com.cenqua.clover.tasks.AbstractCloverTask.execute(AbstractCloverTask.java:52)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:357)
	at org.apache.tools.ant.Target.performTasks(Target.java:385)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
	at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
	at org.apache.tools.ant.Main.runBuild(Main.java:758)
	at org.apache.tools.ant.Main.startAnt(Main.java:217)
	at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
	at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)

Total time: 0 seconds
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
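[Editor's note] Build 757 (and 754 later in this digest) aborts inside `clover.setup`, before any compilation, because the slave's Clover license expired; until the license file is renewed, every run of the ANALYSIS command will fail the same way in zero seconds. The actionable line is the one immediately after "BUILD FAILED", and a small hypothetical helper can pull it out of a console tail when triaging:

```shell
#!/bin/sh
# Hypothetical triage helper, not part of the job: print the first line after
# "BUILD FAILED", which for builds 754/757 is the expired-license message.
failure_cause() {
    awk '/^BUILD FAILED/ { getline; print; exit }' "$1"
}

# Demo against a fragment matching the console output above.
cat > /tmp/ant_tail.txt <<'EOF'
BUILD FAILED
java.lang.RuntimeException: Clover upgrades for your license ended December 14 2010, and this version of Clover was built May 31 2011.
EOF

failure_cause /tmp/ant_tail.txt
```

Since the failure happens in `clover.setup` itself, it seems likely that dropping the `-Drun.clover=true` flag and the clover targets from the ANALYSIS command would let tests run without coverage in the meantime, though that depends on how the build.xml gates Clover and is an assumption, not something this log confirms.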

Hadoop-Hdfs-trunk - Build # 756 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/756/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1471 lines...]
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]       throws IOException, UnresolvedLinkException {
    [javac]                           ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 19 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Updating HDFS-2265
Updating HDFS-2260
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
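[Editor's note] Build 756 (and 755, which shows the identical errors) fails compilation with 100 "cannot find symbol" errors, all for classes such as Path and UnresolvedLinkException that live on the hadoop-common side of the codebase. When javac loses that many symbols from one dependency at once, the likely culprit is a stale or missing hadoop-common artifact on the compile classpath rather than a bug in Hdfs.java itself, and the HDFS-2265/HDFS-2260 updates in this report place the tree mid-reorganization; that said, this is a diagnosis inferred from the error pattern, not stated by the log. A hypothetical one-liner can list the distinct unresolved symbols to check whether they cluster in one dependency:

```shell
#!/bin/sh
# Hypothetical triage helper, not part of the job: list the distinct symbols
# javac could not resolve. If they all belong to one dependency's packages,
# suspect the classpath before suspecting the source file.
missing_symbols() {
    grep "symbol  : class" "$1" | sort -u
}

# Demo against a fragment matching the console output above (note the repeats).
cat > /tmp/javac_tail.txt <<'EOF'
    [javac] symbol  : class UnresolvedLinkException
    [javac] symbol  : class Path
    [javac] symbol  : class Path
    [javac] symbol  : class UnresolvedLinkException
EOF

missing_symbols /tmp/javac_tail.txt
```

Here the four reported errors collapse to two distinct classes from the same package, which is the pattern that points at a dependency problem.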

Hadoop-Hdfs-trunk - Build # 755 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/755/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1467 lines...]
    [javac]   public FSDataInputStream open(Path f, int bufferSize) 
    [javac]          ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]       throws IOException, UnresolvedLinkException {
    [javac]                           ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 17 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 754 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/754/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1243 lines...]
     [echo] contrib: fuse-dfs

clean-fi:

clean-sign:

clean:
   [delete] Deleting directory /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build

BUILD SUCCESSFUL
Total time: 0 seconds


======================================================================
======================================================================
ANALYSIS: ant -Drun.clover=true clover checkstyle test generate-clover-reports -Dtest.junit.output.format=xml -Dtest.output=yes -Dcompile.c++=true -Dcompile.native=true -Dfindbugs.home=$FINDBUGS_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================


Buildfile: build.xml

clover.setup:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/clover/db
[clover-setup] Clover Version 3.1.0, built on May 31 2011 (build-821)
[clover-setup] Loaded from: /home/jenkins/tools/clover/latest/lib/clover.jar

BUILD FAILED
java.lang.RuntimeException: Clover upgrades for your license ended December 14 2010, and this version of Clover was built May 31 2011. Please visit http://www.atlassian.com/clover/renew for information on upgrading your license.
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:103)
	at com.cenqua.clover.CloverStartup.loadLicense(CloverStartup.java:25)
	at com.cenqua.clover.tasks.AbstractCloverTask.execute(AbstractCloverTask.java:52)
	at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
	at sun.reflect.GeneratedMethodAccessor1.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
	at org.apache.tools.ant.Task.perform(Task.java:348)
	at org.apache.tools.ant.Target.execute(Target.java:357)
	at org.apache.tools.ant.Target.performTasks(Target.java:385)
	at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
	at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
	at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
	at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
	at org.apache.tools.ant.Main.runBuild(Main.java:758)
	at org.apache.tools.ant.Main.startAnt(Main.java:217)
	at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
	at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)

Total time: 0 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 753 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/753/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1402551 lines...]
    [junit] 2011-08-16 04:27:18,730 INFO  datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(142)) - All async disk service threads have been shut down.
    [junit] 2011-08-16 04:27:18,731 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-16 04:27:18,832 WARN  blockmanagement.BlockManager (BlockManager.java:run(2614)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2612)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 2011-08-16 04:27:18,832 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-16 04:27:18,832 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(859)) - Ending log segment 1
    [junit] 2011-08-16 04:27:18,843 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(492)) - Number of transactions: 8 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 56 48 
    [junit] 2011-08-16 04:27:18,844 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 38584
    [junit] 2011-08-16 04:27:18,844 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 38584: exiting
    [junit] 2011-08-16 04:27:18,844 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 38584
    [junit] 2011-08-16 04:27:18,845 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-16 04:27:18,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-16 04:27:18,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-16 04:27:18,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-16 04:27:18,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort38584
    [junit] 2011-08-16 04:27:18,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort38584
    [junit] 2011-08-16 04:27:18,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-16 04:27:18,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort41691
    [junit] 2011-08-16 04:27:18,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort41691
    [junit] 2011-08-16 04:27:18,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-16 04:27:18,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-43076
    [junit] 2011-08-16 04:27:18,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort59622
    [junit] 2011-08-16 04:27:18,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort59622
    [junit] 2011-08-16 04:27:18,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-16 04:27:18,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-35471
    [junit] 2011-08-16 04:27:18,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort34995
    [junit] 2011-08-16 04:27:18,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort34995
    [junit] 2011-08-16 04:27:18,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-16 04:27:18,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-33376
    [junit] 2011-08-16 04:27:18,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort49871
    [junit] 2011-08-16 04:27:18,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort49871
    [junit] 2011-08-16 04:27:18,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-16 04:27:18,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-47339
    [junit] 2011-08-16 04:27:18,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-16 04:27:18,850 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 102.733 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 78 minutes 11 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestDFSRollback.testRollback

Error Message:
File contents differed:   /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/data/dfs/data2/current/VERSION=d0342cd292d1c7d919b5f4b000f940de   /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/data/dfs/data1/current/VERSION=5baa37e94882a5fbc28d9d74bfc04db0

Stack Trace:
junit.framework.AssertionFailedError: File contents differed:
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/data/dfs/data2/current/VERSION=d0342cd292d1c7d919b5f4b000f940de
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/data/dfs/data1/current/VERSION=5baa37e94882a5fbc28d9d74bfc04db0
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/data/dfs/data2/current/VERSION=d0342cd292d1c7d919b5f4b000f940de
  /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build/test/data/dfs/data1/current/VERSION=5baa37e94882a5fbc28d9d74bfc04db0
	at org.apache.hadoop.hdfs.server.namenode.FSImageTestUtil.assertFileContentsSame(FSImageTestUtil.java:251)
	at org.apache.hadoop.hdfs.server.namenode.FSImageTestUtil.assertParallelFilesAreIdentical(FSImageTestUtil.java:237)
	at org.apache.hadoop.hdfs.TestDFSRollback.checkResult(TestDFSRollback.java:86)
	at org.apache.hadoop.hdfs.TestDFSRollback.__CLR2_4_37oj5yb1gcn(TestDFSRollback.java:171)
	at org.apache.hadoop.hdfs.TestDFSRollback.testRollback(TestDFSRollback.java:134)


REGRESSION:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.


FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p1541(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1hcb(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f75(TestNNThroughputBenchmark.java:39)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bi8(TestValidateConfigurationSettings.java:49)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1bii(TestValidateConfigurationSettings.java:71)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)




Hadoop-Hdfs-trunk - Build # 752 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/752/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1466 lines...]
    [javac]   public FSDataInputStream open(Path f, int bufferSize) 
    [javac]          ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]       throws IOException, UnresolvedLinkException {
    [javac]                           ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 16 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Updating HDFS-2233
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 751 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/751/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1499156 lines...]
    [junit] 2011-08-15 22:17:08,057 INFO  datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(142)) - All async disk service threads have been shut down.
    [junit] 2011-08-15 22:17:08,057 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-15 22:17:08,158 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-15 22:17:08,158 WARN  blockmanagement.BlockManager (BlockManager.java:run(2614)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2612)
    [junit] 	at java.lang.Thread.run(Thread.java:662)
    [junit] 2011-08-15 22:17:08,159 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(859)) - Ending log segment 1
    [junit] 2011-08-15 22:17:08,170 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(492)) - Number of transactions: 8 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 94 83 
    [junit] 2011-08-15 22:17:08,171 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 42042
    [junit] 2011-08-15 22:17:08,172 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 42042: exiting
    [junit] 2011-08-15 22:17:08,172 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 42042
    [junit] 2011-08-15 22:17:08,172 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-15 22:17:08,172 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-15 22:17:08,172 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-15 22:17:08,173 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-15 22:17:08,173 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort42042
    [junit] 2011-08-15 22:17:08,173 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort42042
    [junit] 2011-08-15 22:17:08,173 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-15 22:17:08,173 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort47877
    [junit] 2011-08-15 22:17:08,174 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort47877
    [junit] 2011-08-15 22:17:08,174 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-15 22:17:08,174 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-34652
    [junit] 2011-08-15 22:17:08,174 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort55190
    [junit] 2011-08-15 22:17:08,174 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort55190
    [junit] 2011-08-15 22:17:08,175 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-15 22:17:08,175 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-57909
    [junit] 2011-08-15 22:17:08,175 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort51148
    [junit] 2011-08-15 22:17:08,175 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort51148
    [junit] 2011-08-15 22:17:08,176 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-15 22:17:08,176 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-46009
    [junit] 2011-08-15 22:17:08,176 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort60950
    [junit] 2011-08-15 22:17:08,176 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort60950
    [junit] 2011-08-15 22:17:08,176 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-15 22:17:08,177 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-asf011.sp2.ygridcore.net-35625
    [junit] 2011-08-15 22:17:08,177 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-15 22:17:08,177 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 102.287 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 61 minutes 22 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
15 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.server.datanode.TestDataDirs.testGetDataDirsFromURIs

Error Message:
org/apache/hadoop/fs/permission/FsPermission

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/permission/FsPermission
	at org.apache.hadoop.hdfs.server.datanode.TestDataDirs.testGetDataDirsFromURIs(TestDataDirs.java:42)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.permission.FsPermission
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestGetImageServlet.initializationError

Error Message:
org/apache/hadoop/conf/Configuration

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
	at java.lang.Class.getDeclaredMethods0(Native Method)
	at java.lang.Class.privateGetDeclaredMethods(Class.java:2427)
	at java.lang.Class.getDeclaredMethods(Class.java:1791)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testReplication

Error Message:
org/apache/hadoop/io/Writable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testReplication(TestINodeFile.java:47)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testReplicationBelowLowerBound

Error Message:
Unexpected exception, expected<java.lang.IllegalArgumentException> but was<java.lang.NoClassDefFoundError>

Stack Trace:
java.lang.Exception: Unexpected exception, expected<java.lang.IllegalArgumentException> but was<java.lang.NoClassDefFoundError>
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testReplicationBelowLowerBound(TestINodeFile.java:64)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSize

Error Message:
org/apache/hadoop/io/Writable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSize(TestINodeFile.java:77)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSizeUpperBound

Error Message:
org/apache/hadoop/io/Writable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSizeUpperBound(TestINodeFile.java:88)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSizeBelowLowerBound

Error Message:
Unexpected exception, expected<java.lang.IllegalArgumentException> but was<java.lang.NoClassDefFoundError>

Stack Trace:
java.lang.Exception: Unexpected exception, expected<java.lang.IllegalArgumentException> but was<java.lang.NoClassDefFoundError>
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSizeBelowLowerBound(TestINodeFile.java:105)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSizeAboveUpperBound

Error Message:
Unexpected exception, expected<java.lang.IllegalArgumentException> but was<java.lang.NoClassDefFoundError>

Stack Trace:
java.lang.Exception: Unexpected exception, expected<java.lang.IllegalArgumentException> but was<java.lang.NoClassDefFoundError>
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/io/Writable
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClassCond(ClassLoader.java:631)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:615)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:141)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:283)
	at java.net.URLClassLoader.access$000(URLClassLoader.java:58)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:197)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testPreferredBlockSizeAboveUpperBound(TestINodeFile.java:120)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.Writable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testGetFullPathName

Error Message:
org/apache/hadoop/fs/permission/PermissionStatus

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/permission/PermissionStatus
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testGetFullPathName(TestINodeFile.java:127)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.permission.PermissionStatus
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:247)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testAppendBlocks

Error Message:
org/apache/hadoop/fs/permission/PermissionStatus

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/permission/PermissionStatus
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.createINodeFiles(TestINodeFile.java:195)
	at org.apache.hadoop.hdfs.server.namenode.TestINodeFile.testAppendBlocks(TestINodeFile.java:157)


FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:350)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:560)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:39)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:49)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:71)




Hadoop-Hdfs-trunk - Build # 750 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/750/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 924 lines...]
A         buildMR-279Branch.sh
AU        hudsonBuildHadoopPatch.sh
AU        hudsonBuildHadoopRelease.sh
AU        processHadoopPatchEmailRemote.sh
AU        hudsonPatchQueueAdmin.sh
AU        processHadoopPatchEmail.sh
A         README.txt
A         test-patch
A         test-patch/test-patch.sh
At revision 1157923
no change for http://svn.apache.org/repos/asf/hadoop/nightly since the previous build
no change for https://svn.apache.org/repos/asf/hadoop/common/trunk/common/src/test/bin since the previous build
No emails were triggered.
[Hadoop-Hdfs-trunk] $ /bin/bash /tmp/hudson5719509509992190546.sh
1024


======================================================================
======================================================================
CLEAN: cleaning workspace
======================================================================
======================================================================


/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/nightly/commitBuild.sh: line 43: /home/hudson/tools/ant/latest/bin/ant: No such file or directory


======================================================================
======================================================================
BUILD: ant clean binary findbugs -Dtest.junit.output.format=xml -Dtest.output=yes -Dcompile.c++=true -Dcompile.native=true -Dfindbugs.home=$FINDBUGS_HOME -Dforrest.home=$FORREST_HOME -Dclover.home=$CLOVER_HOME -Declipse.home=$ECLIPSE_HOME
======================================================================
======================================================================


/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/nightly/commitBuild.sh: line 20: /home/hudson/tools/ant/latest/bin/ant: No such file or directory


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Updating HDFS-73
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 749 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/749/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1468 lines...]
    [javac]          ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]       throws IOException, UnresolvedLinkException {
    [javac]                           ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 17 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 748 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/748/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1467 lines...]
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]       throws IOException, UnresolvedLinkException {
    [javac]                           ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 17 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2237
Updating HDFS-2245
Updating HDFS-2241
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 747 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/747/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1124054 lines...]
    [junit] 2011-08-10 22:55:14,352 INFO  datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(142)) - All async disk service threads have been shut down.
    [junit] 2011-08-10 22:55:14,353 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-10 22:55:14,454 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-10 22:55:14,454 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(859)) - Ending log segment 1
    [junit] 2011-08-10 22:55:14,454 WARN  blockmanagement.BlockManager (BlockManager.java:run(2611)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2609)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-10 22:55:14,466 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(492)) - Number of transactions: 8 Total time for transactions(ms): 0 Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 55 43 
    [junit] 2011-08-10 22:55:14,467 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 43796
    [junit] 2011-08-10 22:55:14,467 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 43796: exiting
    [junit] 2011-08-10 22:55:14,467 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 43796
    [junit] 2011-08-10 22:55:14,467 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-10 22:55:14,467 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-10 22:55:14,468 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-10 22:55:14,468 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-10 22:55:14,468 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort43796
    [junit] 2011-08-10 22:55:14,469 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort43796
    [junit] 2011-08-10 22:55:14,469 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-10 22:55:14,469 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort49200
    [junit] 2011-08-10 22:55:14,469 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort49200
    [junit] 2011-08-10 22:55:14,469 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-10 22:55:14,470 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-48963
    [junit] 2011-08-10 22:55:14,470 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53818
    [junit] 2011-08-10 22:55:14,470 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53818
    [junit] 2011-08-10 22:55:14,470 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-10 22:55:14,471 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-46739
    [junit] 2011-08-10 22:55:14,471 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort33025
    [junit] 2011-08-10 22:55:14,471 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort33025
    [junit] 2011-08-10 22:55:14,471 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-10 22:55:14,472 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-41472
    [junit] 2011-08-10 22:55:14,472 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort48903
    [junit] 2011-08-10 22:55:14,472 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort48903
    [junit] 2011-08-10 22:55:14,472 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-10 22:55:14,473 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-58763
    [junit] 2011-08-10 22:55:14,473 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-10 22:55:14,473 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 119.129 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 118 minutes 28 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p151r(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1h97(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f41(TestNNThroughputBenchmark.java:39)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bff(TestValidateConfigurationSettings.java:49)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1bfp(TestValidateConfigurationSettings.java:71)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)




Hadoop-Hdfs-trunk - Build # 746 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/746/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1466 lines...]
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public FSDataInputStream open(Path f, int bufferSize) 
    [javac]          ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:331: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]       throws IOException, UnresolvedLinkException {
    [javac]                           ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                              ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:337: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst) 
    [javac]                                        ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:338: cannot find symbol
    [javac] symbol  : class UnresolvedLinkException
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]     throws IOException, UnresolvedLinkException {
    [javac]                         ^
    [javac] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/src/java/org/apache/hadoop/fs/Hdfs.java:343: cannot find symbol
    [javac] symbol  : class Path
    [javac] location: class org.apache.hadoop.fs.Hdfs
    [javac]   public void renameInternal(Path src, Path dst, boolean overwrite)
    [javac]                              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 100 errors

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:370: Compile failed; see the compiler error output for details.

Total time: 19 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/*.jar': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 745 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/745/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1123457 lines...]
    [junit] 2011-08-10 13:31:21,981 WARN  blockmanagement.BlockManager (BlockManager.java:run(2611)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2609)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-10 13:31:21,981 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(859)) - Ending log segment 1
    [junit] 2011-08-10 13:31:21,981 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-10 13:31:21,991 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(492)) - Number of transactions: 8 Total time for transactions(ms): 0 Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 63 77 
    [junit] 2011-08-10 13:31:21,993 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 55483
    [junit] 2011-08-10 13:31:21,993 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 55483: exiting
    [junit] 2011-08-10 13:31:21,994 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 55483
    [junit] 2011-08-10 13:31:21,994 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-10 13:31:21,994 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-10 13:31:21,994 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-10 13:31:21,994 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort55483
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort55483
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-10 13:31:21,995 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort56715
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort56715
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-36722
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort47304
    [junit] 2011-08-10 13:31:21,996 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort47304
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-53148
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort50799
    [junit] 2011-08-10 13:31:21,997 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort50799
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-52118
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53914
    [junit] 2011-08-10 13:31:21,998 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53914
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-47998
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-10 13:31:21,999 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.515 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 118 minutes 46 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2227
Updating HDFS-2239
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.cli.TestHDFSCLI.initializationError

Error Message:
Lorg/apache/hadoop/fs/FileSystem;

Stack Trace:
java.lang.NoClassDefFoundError: Lorg/apache/hadoop/fs/FileSystem;
	at java.lang.Class.getDeclaredFields0(Native Method)
	at java.lang.Class.privateGetDeclaredFields(Class.java:2291)
	at java.lang.Class.getDeclaredFields(Class.java:1743)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FileSystem
	at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:248)


FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p151r(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1h97(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f41(TestNNThroughputBenchmark.java:39)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bff(TestValidateConfigurationSettings.java:49)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1bfp(TestValidateConfigurationSettings.java:71)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)




Hadoop-Hdfs-trunk - Build # 744 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/744/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1082561 lines...]
    [junit] 2011-08-09 13:31:14,572 WARN  blockmanagement.BlockManager (BlockManager.java:run(2604)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2602)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-09 13:31:14,572 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(822)) - Ending log segment 1
    [junit] 2011-08-09 13:31:14,572 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-09 13:31:14,581 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(501)) - Number of transactions: 8 Total time for transactions(ms): 1 Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 45 38 
    [junit] 2011-08-09 13:31:14,583 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 60052
    [junit] 2011-08-09 13:31:14,583 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 60052: exiting
    [junit] 2011-08-09 13:31:14,584 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 60052
    [junit] 2011-08-09 13:31:14,584 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-09 13:31:14,584 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort60052
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort60052
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort54789
    [junit] 2011-08-09 13:31:14,585 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort54789
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-41739
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort33237
    [junit] 2011-08-09 13:31:14,586 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort33237
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-34160
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort58558
    [junit] 2011-08-09 13:31:14,587 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort58558
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-58667
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort39400
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort39400
    [junit] 2011-08-09 13:31:14,588 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-09 13:31:14,589 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-39667
    [junit] 2011-08-09 13:31:14,589 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-09 13:31:14,589 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.248 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 118 minutes 43 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2238
Updating HDFS-2230
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend

Error Message:
testComplexAppend Worker encountered exceptions.

Stack Trace:
junit.framework.AssertionFailedError: testComplexAppend Worker encountered exceptions.
	at org.apache.hadoop.hdfs.TestFileAppend2.__CLR2_4_3dvc5331967(TestFileAppend2.java:387)
	at org.apache.hadoop.hdfs.TestFileAppend2.testComplexAppend(TestFileAppend2.java:332)


REGRESSION:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p150k(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:627)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:542)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:258)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:86)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:244)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz1h6y(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.__CLR2_4_3b2i9ur1f1u(TestNNThroughputBenchmark.java:39)
	at org.apache.hadoop.hdfs.server.namenode.TestNNThroughputBenchmark.testNNThroughput(TestNNThroughputBenchmark.java:35)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3b49o261bdv(TestValidateConfigurationSettings.java:49)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatMatchingRPCandHttpPortsThrowException(TestValidateConfigurationSettings.java:43)


REGRESSION:  org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:173)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1361)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.__CLR2_4_3ihms9r1be5(TestValidateConfigurationSettings.java:71)
	at org.apache.hadoop.hdfs.server.namenode.TestValidateConfigurationSettings.testThatDifferentRPCandHttpPortsAreOK(TestValidateConfigurationSettings.java:66)


FAILED:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.




Hadoop-Hdfs-trunk - Build # 743 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/743/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1074463 lines...]
    [junit] 2011-08-08 13:09:14,223 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-08 13:09:14,324 WARN  blockmanagement.BlockManager (BlockManager.java:run(2604)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2602)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-08 13:09:14,324 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(822)) - Ending log segment 1
    [junit] 2011-08-08 13:09:14,324 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-08 13:09:14,335 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(501)) - Number of transactions: 8 Total time for transactions(ms): 1 Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 40 47 
    [junit] 2011-08-08 13:09:14,336 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 38801
    [junit] 2011-08-08 13:09:14,337 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 38801: exiting
    [junit] 2011-08-08 13:09:14,337 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 38801
    [junit] 2011-08-08 13:09:14,337 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-08 13:09:14,337 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-08 13:09:14,337 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-08 13:09:14,338 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-08 13:09:14,338 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort38801
    [junit] 2011-08-08 13:09:14,338 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort38801
    [junit] 2011-08-08 13:09:14,338 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-08 13:09:14,339 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort47737
    [junit] 2011-08-08 13:09:14,339 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort47737
    [junit] 2011-08-08 13:09:14,339 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-08 13:09:14,339 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-60216
    [junit] 2011-08-08 13:09:14,340 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort38182
    [junit] 2011-08-08 13:09:14,340 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort38182
    [junit] 2011-08-08 13:09:14,340 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-08 13:09:14,340 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-40256
    [junit] 2011-08-08 13:09:14,341 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort38609
    [junit] 2011-08-08 13:09:14,341 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort38609
    [junit] 2011-08-08 13:09:14,341 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-08 13:09:14,341 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-46206
    [junit] 2011-08-08 13:09:14,341 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53288
    [junit] 2011-08-08 13:09:14,342 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53288
    [junit] 2011-08-08 13:09:14,342 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-08 13:09:14,342 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-58981
    [junit] 2011-08-08 13:09:14,342 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-08 13:09:14,343 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.038 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 97 minutes 30 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2228
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.




Hadoop-Hdfs-trunk - Build # 742 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/742/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1096622 lines...]
    [junit] 2011-08-07 13:26:49,247 INFO  datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(142)) - All async disk service threads have been shut down.
    [junit] 2011-08-07 13:26:49,248 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-07 13:26:49,349 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-07 13:26:49,349 WARN  blockmanagement.BlockManager (BlockManager.java:run(2370)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2368)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-07 13:26:49,349 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(822)) - Ending log segment 1
    [junit] 2011-08-07 13:26:49,357 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(501)) - Number of transactions: 8 Total time for transactions(ms): 1Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 43 38 
    [junit] 2011-08-07 13:26:49,359 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 53315
    [junit] 2011-08-07 13:26:49,359 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 53315: exiting
    [junit] 2011-08-07 13:26:49,359 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 53315
    [junit] 2011-08-07 13:26:49,359 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-07 13:26:49,359 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-07 13:26:49,360 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-07 13:26:49,360 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-07 13:26:49,360 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53315
    [junit] 2011-08-07 13:26:49,360 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53315
    [junit] 2011-08-07 13:26:49,361 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-07 13:26:49,361 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort53121
    [junit] 2011-08-07 13:26:49,361 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort53121
    [junit] 2011-08-07 13:26:49,361 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-07 13:26:49,362 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-49020
    [junit] 2011-08-07 13:26:49,362 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort38528
    [junit] 2011-08-07 13:26:49,362 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort38528
    [junit] 2011-08-07 13:26:49,362 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-07 13:26:49,363 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-49327
    [junit] 2011-08-07 13:26:49,363 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort59359
    [junit] 2011-08-07 13:26:49,363 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort59359
    [junit] 2011-08-07 13:26:49,363 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-07 13:26:49,364 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-56851
    [junit] 2011-08-07 13:26:49,364 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort56988
    [junit] 2011-08-07 13:26:49,364 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort56988
    [junit] 2011-08-07 13:26:49,364 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-07 13:26:49,365 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-43917
    [junit] 2011-08-07 13:26:49,365 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-07 13:26:49,365 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 121.63 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 114 minutes 25 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestDFSRemove.testRemove

Error Message:
All blocks should be gone. start=98304 max=102304 final=100764 expected:<98304> but was:<100764>

Stack Trace:
junit.framework.AssertionFailedError: All blocks should be gone. start=98304 max=102304 final=100764 expected:<98304> but was:<100764>
	at org.apache.hadoop.hdfs.TestDFSRemove.__CLR2_4_3ej1cn113kh(TestDFSRemove.java:82)
	at org.apache.hadoop.hdfs.TestDFSRemove.testRemove(TestDFSRemove.java:57)


REGRESSION:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.


REGRESSION:  org.apache.hadoop.hdfs.TestReplaceDatanodeOnFailure.testReplaceDatanodeOnFailure

Error Message:
Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:60615, 127.0.0.1:52064], original=[127.0.0.1:60615, 127.0.0.1:52064]

Stack Trace:
java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:60615, 127.0.0.1:52064], original=[127.0.0.1:60615, 127.0.0.1:52064]
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:767)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:823)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:919)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.processDatanodeError(DFSOutputStream.java:730)
	at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:416)


FAILED:  org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeak

Error Message:
Call to localhost/127.0.0.1:47874 failed on socket timeout exception: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:54293 remote=localhost/127.0.0.1:47874]

Stack Trace:
java.net.SocketTimeoutException: Call to localhost/127.0.0.1:47874 failed on socket timeout exception: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:54293 remote=localhost/127.0.0.1:47874]
	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1085)
	at org.apache.hadoop.ipc.Client.call(Client.java:1057)
	at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:193)
	at $Proxy7.getReplicaVisibleLength(Unknown Source)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.__CLR2_4_313r7v311rh(TestBlockToken.java:298)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeak(TestBlockToken.java:262)
Caused by: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:54293 remote=localhost/127.0.0.1:47874]
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:159)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:132)
	at java.io.DataInputStream.read(DataInputStream.java:132)
	at org.apache.hadoop.security.SaslInputStream.read(SaslInputStream.java:243)
	at java.io.FilterInputStream.read(FilterInputStream.java:116)
	at org.apache.hadoop.ipc.Client$Connection$PingInputStream.read(Client.java:369)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:237)
	at java.io.DataInputStream.readInt(DataInputStream.java:370)
	at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:786)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:724)




Hadoop-Hdfs-trunk - Build # 741 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/741/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1069428 lines...]
    [junit] 2011-08-06 13:26:22,404 INFO  datanode.FSDatasetAsyncDiskService (FSDatasetAsyncDiskService.java:shutdown(142)) - All async disk service threads have been shut down.
    [junit] 2011-08-06 13:26:22,404 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-06 13:26:22,505 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-06 13:26:22,506 WARN  blockmanagement.BlockManager (BlockManager.java:run(2370)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2368)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-06 13:26:22,506 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(822)) - Ending log segment 1
    [junit] 2011-08-06 13:26:22,516 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(501)) - Number of transactions: 8 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 46 39 
    [junit] 2011-08-06 13:26:22,517 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 38144
    [junit] 2011-08-06 13:26:22,518 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 38144: exiting
    [junit] 2011-08-06 13:26:22,518 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 38144
    [junit] 2011-08-06 13:26:22,518 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-06 13:26:22,518 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-06 13:26:22,518 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-06 13:26:22,518 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-06 13:26:22,519 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort38144
    [junit] 2011-08-06 13:26:22,519 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort38144
    [junit] 2011-08-06 13:26:22,519 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-06 13:26:22,519 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort37245
    [junit] 2011-08-06 13:26:22,520 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort37245
    [junit] 2011-08-06 13:26:22,520 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-06 13:26:22,520 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-35678
    [junit] 2011-08-06 13:26:22,520 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort58168
    [junit] 2011-08-06 13:26:22,521 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort58168
    [junit] 2011-08-06 13:26:22,521 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-06 13:26:22,521 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-57663
    [junit] 2011-08-06 13:26:22,521 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort60389
    [junit] 2011-08-06 13:26:22,522 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort60389
    [junit] 2011-08-06 13:26:22,522 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-06 13:26:22,522 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-46874
    [junit] 2011-08-06 13:26:22,522 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort50478
    [junit] 2011-08-06 13:26:22,523 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort50478
    [junit] 2011-08-06 13:26:22,523 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-06 13:26:22,523 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-51241
    [junit] 2011-08-06 13:26:22,523 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-06 13:26:22,524 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 123.329 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 114 minutes 41 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeak

Error Message:
Call to localhost/127.0.0.1:57096 failed on local exception: java.io.IOException: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:52139 remote=localhost/127.0.0.1:57096]

Stack Trace:
java.io.IOException: Call to localhost/127.0.0.1:57096 failed on local exception: java.io.IOException: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:52139 remote=localhost/127.0.0.1:57096]
	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1089)
	at org.apache.hadoop.ipc.Client.call(Client.java:1057)
	at org.apache.hadoop.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:193)
	at $Proxy7.getReplicaVisibleLength(Unknown Source)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.__CLR2_4_313r7v311rh(TestBlockToken.java:298)
	at org.apache.hadoop.hdfs.security.token.block.TestBlockToken.testBlockTokenRpcLeak(TestBlockToken.java:262)
Caused by: java.io.IOException: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:52139 remote=localhost/127.0.0.1:57096]
	at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:504)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1135)
	at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:468)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:552)
	at org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:207)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1190)
	at org.apache.hadoop.ipc.Client.call(Client.java:1034)
Caused by: java.net.SocketTimeoutException: 1000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/127.0.0.1:52139 remote=localhost/127.0.0.1:57096]
	at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:164)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:159)
	at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:132)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:237)
	at java.io.DataInputStream.readInt(DataInputStream.java:370)
	at org.apache.hadoop.security.SaslRpcClient.readStatus(SaslRpcClient.java:109)
	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:173)
	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:406)
	at org.apache.hadoop.ipc.Client$Connection.access$1200(Client.java:207)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:545)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:542)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1135)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:541)




Hadoop-Hdfs-trunk - Build # 740 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/740/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1028834 lines...]
    [junit] 2011-08-06 09:43:36,553 INFO  mortbay.log (Slf4jLog.java:info(67)) - Stopped SelectChannelConnector@localhost:0
    [junit] 2011-08-06 09:43:36,654 WARN  blockmanagement.DecommissionManager (DecommissionManager.java:run(75)) - Monitor interrupted: java.lang.InterruptedException: sleep interrupted
    [junit] 2011-08-06 09:43:36,654 WARN  blockmanagement.BlockManager (BlockManager.java:run(2370)) - ReplicationMonitor thread received InterruptedException.
    [junit] java.lang.InterruptedException: sleep interrupted
    [junit] 	at java.lang.Thread.sleep(Native Method)
    [junit] 	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager$ReplicationMonitor.run(BlockManager.java:2368)
    [junit] 	at java.lang.Thread.run(Thread.java:619)
    [junit] 2011-08-06 09:43:36,655 INFO  namenode.FSEditLog (FSEditLog.java:endCurrentLogSegment(822)) - Ending log segment 1
    [junit] 2011-08-06 09:43:36,842 INFO  namenode.FSEditLog (FSEditLog.java:printStatistics(501)) - Number of transactions: 8 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 7 SyncTimes(ms): 220 42 
    [junit] 2011-08-06 09:43:36,844 INFO  ipc.Server (Server.java:stop(1715)) - Stopping server on 57023
    [junit] 2011-08-06 09:43:36,844 INFO  ipc.Server (Server.java:run(1539)) - IPC Server handler 0 on 57023: exiting
    [junit] 2011-08-06 09:43:36,844 INFO  ipc.Server (Server.java:run(505)) - Stopping IPC Server listener on 57023
    [junit] 2011-08-06 09:43:36,844 INFO  ipc.Server (Server.java:run(647)) - Stopping IPC Server Responder
    [junit] 2011-08-06 09:43:36,844 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(199)) - Stopping DataNode metrics system...
    [junit] 2011-08-06 09:43:36,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics
    [junit] 2011-08-06 09:43:36,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source NameNodeActivity
    [junit] 2011-08-06 09:43:36,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort57023
    [junit] 2011-08-06 09:43:36,845 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort57023
    [junit] 2011-08-06 09:43:36,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source FSNamesystem
    [junit] 2011-08-06 09:43:36,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort36157
    [junit] 2011-08-06 09:43:36,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort36157
    [junit] 2011-08-06 09:43:36,846 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-1
    [junit] 2011-08-06 09:43:36,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-50861
    [junit] 2011-08-06 09:43:36,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort34446
    [junit] 2011-08-06 09:43:36,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort34446
    [junit] 2011-08-06 09:43:36,847 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-2
    [junit] 2011-08-06 09:43:36,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-32950
    [junit] 2011-08-06 09:43:36,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort57889
    [junit] 2011-08-06 09:43:36,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort57889
    [junit] 2011-08-06 09:43:36,848 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-3
    [junit] 2011-08-06 09:43:36,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-59346
    [junit] 2011-08-06 09:43:36,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcActivityForPort52004
    [junit] 2011-08-06 09:43:36,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source RpcDetailedActivityForPort52004
    [junit] 2011-08-06 09:43:36,849 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source JvmMetrics-4
    [junit] 2011-08-06 09:43:36,850 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stopSources(408)) - Stopping metrics source DataNodeActivity-janus.apache.org-37271
    [junit] 2011-08-06 09:43:36,850 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:stop(205)) - DataNode metrics system stopped.
    [junit] 2011-08-06 09:43:36,850 INFO  impl.MetricsSystemImpl (MetricsSystemImpl.java:shutdown(553)) - DataNode metrics system shutdown complete.
    [junit] Tests run: 16, Failures: 0, Errors: 0, Time elapsed: 120.816 sec

checkfailure:

-run-test-hdfs-fault-inject-withtestcaseonly:

run-test-hdfs-fault-inject:

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/build.xml:777: Tests failed!

Total time: 114 minutes 40 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2225
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.TestParallelRead.testParallelRead

Error Message:
Timeout occurred. Please note the time in the report does not reflect the time until the timeout.

Stack Trace:
junit.framework.AssertionFailedError: Timeout occurred. Please note the time in the report does not reflect the time until the timeout.