Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2011/08/10 04:17:53 UTC

Hadoop-Hdfs-trunk-Commit - Build # 822 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/822/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2448 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.095 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.889 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.219 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.672 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.394 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.478 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.573 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.691 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 20.872 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 68.872 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.31 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.587 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.772 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.163 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 22.839 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.82 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.127 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.689 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 34 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2239
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10ou(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1362)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:237)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:112)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16jr(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)




Hadoop-Hdfs-trunk-Commit - Build # 840 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/840/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1286 lines...]
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
     [iajc]                                 ^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
     [iajc]                                        
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                     ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
     [iajc]                                    ^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 59 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Updating HDFS-2260
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 839 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/839/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1284 lines...]
     [iajc]                      ^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
     [iajc]                                 ^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
     [iajc]                                        
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                     ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
     [iajc]                                    ^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 1 minute 0 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 838 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/838/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1284 lines...]
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
     [iajc]                                 ^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
     [iajc]                                        
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                     ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
     [iajc]                                    ^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 57 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Updating HDFS-2265
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 837 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/837/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1283 lines...]
     [iajc] "ugi=" + ServletUtil.encodeQueryValue(ugi.getShortUserName()) +
     [iajc]                      ^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
     [iajc]                                 ^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
     [iajc]                                        
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                     ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
     [iajc]                                    ^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 58 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 836 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/836/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1283 lines...]
     [iajc] "ugi=" + ServletUtil.encodeQueryValue(ugi.getShortUserName()) +
     [iajc]                      ^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
     [iajc]                                 ^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
     [iajc]                                        
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                     ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
     [iajc]                                    ^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18
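The 18 compile errors above all say the same thing: the HDFS servlets call ServletUtil.getDecodedPath and ServletUtil.getRawPath, but those methods are undefined in the ServletUtil on the build classpath — which typically means the servlets were compiled against a newer Hadoop Common than the one the AOP build resolves. A minimal sketch of what such helpers plausibly do, inferred only from the call sites in the log (the real methods take an HttpServletRequest; this sketch takes the request URI string directly so it compiles without the servlet API on the classpath):

```java
// Hypothetical reconstruction of the two missing ServletUtil helpers,
// inferred from the call sites in the errors above; not the actual
// Hadoop Common implementation.
import java.io.UnsupportedEncodingException;
import java.net.URLDecoder;

public final class ServletUtilSketch {

  /** Path after the servlet name, still percent-encoded (assumed behavior). */
  public static String getRawPath(String requestUri, String servletName) {
    return requestUri.substring(servletName.length());
  }

  /** Path after the servlet name, percent-decoded (assumed behavior). */
  public static String getDecodedPath(String requestUri, String servletName) {
    try {
      return URLDecoder.decode(getRawPath(requestUri, servletName), "UTF-8");
    } catch (UnsupportedEncodingException e) {
      throw new AssertionError("UTF-8 is always supported", e);
    }
  }

  public static void main(String[] args) {
    String uri = "/streamFile/user/foo%20bar";
    System.out.println(getRawPath(uri, "/streamFile"));     // prints /user/foo%20bar
    System.out.println(getDecodedPath(uri, "/streamFile")); // prints /user/foo bar
  }
}
```

Under this reading, the fix is not in the servlet code but in the dependency: the build needs a Hadoop Common snapshot that actually ships these helpers.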

Total time: 1 minute 0 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording fingerprints
Recording test results
Publishing Javadoc
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 835 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/835/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1275 lines...]
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 55 seconds


Updating HDFS-2233



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 834 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/834/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1274 lines...]
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 58 seconds





###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 833 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/833/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1274 lines...]
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 1 minute 2 seconds


Updating HDFS-73



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 832 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/832/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1273 lines...]
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 58 seconds


Updating HDFS-2240



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 831 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/831/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1274 lines...]
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 55 seconds


Updating HDFS-2186



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 830 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/830/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1274 lines...]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 1 minute 2 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2235
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

RE: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Posted by Eric Payne <er...@yahoo-inc.com>.
Thanks Eli.

I have resolvers=internal in my $HOME/build.properties file. Is that enough, or should I also put -Dresolvers=internal on the command line?

Thanks,
-Eric
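
Whether the file alone suffices can be checked mechanically. A small sketch (the file path and property name are taken from the question above; the helper name is hypothetical):

```shell
# check_resolvers: report whether a given build.properties already sets
# resolvers=internal, as asked above. Purely illustrative.
check_resolvers() {
  # $1 - path to a build.properties file
  if grep -q '^resolvers=internal' "$1" 2>/dev/null; then
    echo "resolvers=internal is set"
  else
    echo "not set; pass -Dresolvers=internal explicitly"
  fi
}
```

Usage would be along the lines of `check_resolvers "$HOME/build.properties"`.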

-----Original Message-----
From: Eli Collins [mailto:eli@cloudera.com] 
Sent: Friday, August 12, 2011 12:06 PM
To: Eric Payne
Cc: hdfs-dev@hadoop.apache.org; Tom White
Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

You need to build HDFS with -Dresolvers=internal after running mvn
install -DskipTests in common.

On Fri, Aug 12, 2011 at 9:51 AM, Eric Payne <er...@yahoo-inc.com> wrote:
> I'm seeing this error when I try to build a fresh checkout.
>
> I can get around it by removing the .m2 directory in my $HOME directory and then running 'mvn install -DskipTests' again in the trunk root.
>
> However, test-patch still gets the error and fails the 'system test framework' build.
>
> -Eric
>
> -----Original Message-----
> From: Alejandro Abdelnur [mailto:tucu@cloudera.com]
> Sent: Friday, August 12, 2011 12:41 AM
> To: Eli Collins
> Cc: hdfs-dev@hadoop.apache.org; Tom White
> Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
>
> Eli,
>
> I think you are right, I'm pretty sure it is picking up the latest deployed
> snapshot.
>
> I'll discuss with Tom tomorrow morning how to take care of this (once HDFS
> is Mavenized we can easily build and use the latest bits from all modules,
> though we'll still need some tricks to avoid running every module's tests).
>
> Thxs.
>
> Alejandro
>
> On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins <el...@cloudera.com> wrote:
>
>> Tucu and co - does hdfs build the latest common or does it try to
>> resolve against the latest deployed common artifact?
>> Looks like hudson-test-patch doesn't pick up on the latest common build.
>>
>>
>>
>> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server
>> <je...@builds.apache.org> wrote:
>> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/
>> >

Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Posted by Eli Collins <el...@cloudera.com>.
You need to build HDFS with -Dresolvers=internal after running mvn
install -DskipTests in common.
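
Spelled out as a sequence, the above amounts to something like the following sketch. The sibling-checkout layout and directory names are assumptions; common builds with Maven while the HDFS tree of this era builds with ant:

```shell
# Sketch of the build order described above; not a drop-in script.
build_hdfs_against_local_common() {
  local common_dir="$1" hdfs_dir="$2"
  [ -d "$common_dir" ] || { echo "missing: $common_dir"; return 1; }
  [ -d "$hdfs_dir" ]   || { echo "missing: $hdfs_dir"; return 1; }
  # 1) publish the current common SNAPSHOT into the local ~/.m2 repository
  (cd "$common_dir" && mvn install -DskipTests) || return 1
  # 2) build HDFS, resolving common from the local repo rather than the
  #    last deployed snapshot
  (cd "$hdfs_dir" && ant clean compile -Dresolvers=internal)
}
```

Invocation would look like `build_hdfs_against_local_common ~/src/hadoop-common ~/src/hadoop-hdfs` (paths hypothetical).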

On Fri, Aug 12, 2011 at 9:51 AM, Eric Payne <er...@yahoo-inc.com> wrote:
> I'm seeing this error when I try to build a fresh checkout.
>
> I can get around it by removing the .m2 directory in my $HOME directory and then running 'mvn install -DskipTests' again in the trunk root.
>
> However, test-patch still gets the error and fails the 'system test framework' build.
>
> -Eric
>
> -----Original Message-----
> From: Alejandro Abdelnur [mailto:tucu@cloudera.com]
> Sent: Friday, August 12, 2011 12:41 AM
> To: Eli Collins
> Cc: hdfs-dev@hadoop.apache.org; Tom White
> Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing
>
> Eli,
>
> I think you are right, I'm pretty sure it is picking up the latest deployed
> snapshot.
>
> I'll discuss with Tom tomorrow morning how to take care of this (once HDFS
> is Mavenized we can easily build and use the latest bits from all modules,
> though we'll still need some tricks to avoid running every module's tests).
>
> Thxs.
>
> Alejandro
>
> On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins <el...@cloudera.com> wrote:
>
>> Tucu and co - does hdfs build the latest common or does it try to
>> resolve against the latest deployed common artifact?
>> Looks like hudson-test-patch doesn't pick up on the latest common build.
>>
>>
>>
>> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server
>> <je...@builds.apache.org> wrote:
>> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/

RE: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Posted by Eric Payne <er...@yahoo-inc.com>.
I'm seeing this error when I try to build a fresh checkout. 

I can get around it by removing the .m2 directory in my $HOME directory and then running 'mvn install -DskipTests' again in the trunk root.

However, test-patch still gets the error and fails the 'system test framework' build.

-Eric
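
A narrower alternative to deleting all of $HOME/.m2 is to clear only the cached Hadoop artifacts so the next 'mvn install' repopulates them. A sketch (the helper name and default path are assumptions):

```shell
# clear_hadoop_artifacts: delete cached org.apache.hadoop artifacts from a
# local Maven repository, a gentler variant of removing ~/.m2 wholesale.
clear_hadoop_artifacts() {
  local repo="${1:-$HOME/.m2/repository}"
  local target="$repo/org/apache/hadoop"
  if [ -d "$target" ]; then
    rm -rf "$target"
    echo "removed $target"
  else
    echo "nothing to remove at $target"
  fi
}
```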

-----Original Message-----
From: Alejandro Abdelnur [mailto:tucu@cloudera.com] 
Sent: Friday, August 12, 2011 12:41 AM
To: Eli Collins
Cc: hdfs-dev@hadoop.apache.org; Tom White
Subject: Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Eli,

I think you are right, I'm pretty sure it is picking up the latest deployed
snapshot.

I'll discuss with Tom tomorrow morning how to take care of this (once HDFS
is Mavenized we can easily build and use the latest bits from all modules,
though we'll still need some tricks to avoid running every module's tests).

Thxs.

Alejandro

On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins <el...@cloudera.com> wrote:

> Tucu and co - does hdfs build the latest common or does it try to
> resolve against the latest deployed common artifact?
> Looks like hudson-test-patch doesn't pick up on the latest common build.
>
>
>
> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server
> <je...@builds.apache.org> wrote:
> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/

Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Posted by Alejandro Abdelnur <tu...@cloudera.com>.
Eli,

I think you are right, I'm pretty sure it is picking up the latest deployed
snapshot.

I'll discuss with Tom tomorrow morning how to take care of this (once HDFS
is Mavenized we can easily build and use the latest bits from all modules,
though we'll still need some tricks to avoid running every module's tests).

Thxs.

Alejandro

On Thu, Aug 11, 2011 at 10:20 PM, Eli Collins <el...@cloudera.com> wrote:

> Tucu and co - does hdfs build the latest common or does it try to
> resolve against the latest deployed common artifact?
> Looks like hudson-test-patch doesn't pick up on the latest common build.
>
>
>
> On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server
> <je...@builds.apache.org> wrote:
> > See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/

Re: Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Posted by Eli Collins <el...@cloudera.com>.
Tucu and co - does HDFS build against the latest common, or does it
resolve against the latest deployed common artifact?
It looks like hudson-test-patch doesn't pick up the latest common build.
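For the resolution question above, one way to see which hadoop-common jar a build actually resolved is to read the Implementation-Version out of each jar's manifest. This is only a sketch: the `build/ivy/lib` path in the commented loop is a guess at the workspace layout, not something these logs confirm.

```shell
#!/bin/sh
# Extract the Implementation-Version value from a MANIFEST.MF read on stdin.
# Manifest lines inside jars end in CRLF, so strip the carriage return.
manifest_version() {
  grep -i '^Implementation-Version:' | cut -d' ' -f2 | tr -d '\r'
}

# Hypothetical usage against resolved jars (adjust the path to your layout):
# for jar in $(find build/ivy/lib -name 'hadoop-common-*.jar'); do
#   printf '%s: %s\n' "$jar" "$(unzip -p "$jar" META-INF/MANIFEST.MF | manifest_version)"
# done
```

If the version printed is older than the last common commit, the build resolved a stale deployed artifact rather than a freshly built one.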



On Thu, Aug 11, 2011 at 10:11 PM, Apache Jenkins Server
<je...@builds.apache.org> wrote:
> See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/
>

Hadoop-Hdfs-trunk-Commit - Build # 829 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/829/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 1273 lines...]
     [iajc]                      ^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:117 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/data");
     [iajc]                                 ^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/FileDataServlet.java:118 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String encodedPath = ServletUtil.getRawPath(request, "/data");
     [iajc]                                        
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:90 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/ListPathsServlet.java:138 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String filePath = ServletUtil.getDecodedPath(request, "/listPaths");
     [iajc]                                     ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:65 [error] The method getDecodedPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String path = ServletUtil.getDecodedPath(request, "/streamFile");
     [iajc]                                 ^^^^^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/java/org/apache/hadoop/hdfs/server/namenode/StreamFile.java:66 [error] The method getRawPath(HttpServletRequest, String) is undefined for the type ServletUtil
     [iajc] final String rawPath = ServletUtil.getRawPath(request, "/streamFile");
     [iajc]                                    ^^^^^
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:67 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:60 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/namenode/NameNodeAspect.aj:50 [warning] advice defined in org.apache.hadoop.hdfs.server.namenode.NameNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/system/aop/org/apache/hadoop/hdfs/server/datanode/DataNodeAspect.aj:43 [warning] advice defined in org.apache.hadoop.hdfs.server.datanode.DataNodeAspect has not been applied [Xlint:adviceDidNotMatch]
     [iajc] 	
     [iajc] 
     [iajc] 18 errors, 4 warnings

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:222: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:203: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/src/test/aop/build/aop.xml:90: compile errors: 18

Total time: 55 seconds


======================================================================
======================================================================
STORE: saving artifacts
======================================================================
======================================================================


mv: cannot stat `build/*.tar.gz': No such file or directory
mv: cannot stat `build/test/findbugs': No such file or directory
mv: cannot stat `build/docs/api': No such file or directory
Build Failed
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2235
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk-Commit - Build # 828 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/828/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2450 lines...]
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.098 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.215 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.412 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.248 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.497 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.791 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.687 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 20.013 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 72.235 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.117 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.466 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.723 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.152 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 22.995 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.869 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.115 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.853 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 25 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10q7(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16lv(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)




Hadoop-Hdfs-trunk-Commit - Build # 827 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/827/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2449 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.097 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.059 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.22 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.53 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.345 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.393 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.79 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.749 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 20.426 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 75.523 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 22.78 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.564 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.809 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 3.995 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 22.797 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 16.167 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.118 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.935 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 46 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HADOOP-6158
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10q7(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16lv(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)




Hadoop-Hdfs-trunk-Commit - Build # 826 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/826/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2448 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.095 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.116 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.217 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.559 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.282 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.604 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.831 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.684 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 19.929 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 72.463 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.348 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.225 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.766 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.333 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 23.146 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.854 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.115 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.841 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 22 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2229
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10p3(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16k0(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)




Hadoop-Hdfs-trunk-Commit - Build # 825 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/825/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2447 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.096 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.101 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.213 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.389 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.141 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.473 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.664 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.688 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 19.484 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 73.597 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.444 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.462 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.897 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.123 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 23.219 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.987 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.115 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.83 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 16 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2245
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10p1(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16jy(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)




Hadoop-Hdfs-trunk-Commit - Build # 824 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/824/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2448 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.095 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.03 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.215 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 18.493 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.324 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.649 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.965 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.692 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 19.808 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 74.15 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 23.634 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.684 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.764 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.145 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 23.022 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 15.781 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.113 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.945 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 34 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2237
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10oz(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16jw(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)




Hadoop-Hdfs-trunk-Commit - Build # 823 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Commit/823/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 2449 lines...]
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.094 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestOverReplicatedBlocks
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.122 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestPendingReplication
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 7.213 sec
    [junit] Running org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy
    [junit] Tests run: 1, Failures: 0, Errors: 1, Time elapsed: 0 sec
    [junit] Test org.apache.hadoop.hdfs.server.blockmanagement.TestReplicationPolicy FAILED
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestBlockReplacement
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 21.534 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDirectoryScanner
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.404 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestDiskError
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 9.577 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestInterDatanodeProtocol
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.807 sec
    [junit] Running org.apache.hadoop.hdfs.server.datanode.TestSimulatedFSDataset
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.693 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestBackupNode
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 19.705 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestCheckpoint
    [junit] Tests run: 27, Failures: 0, Errors: 1, Time elapsed: 74.535 sec
    [junit] Test org.apache.hadoop.hdfs.server.namenode.TestCheckpoint FAILED
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestEditLog
    [junit] Tests run: 13, Failures: 0, Errors: 0, Time elapsed: 24.23 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestFileLimit
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 4.378 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestNamenodeCapacityReport
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 2.809 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestSafeMode
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 4.33 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStartup
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 23.226 sec
    [junit] Running org.apache.hadoop.hdfs.server.namenode.TestStorageRestore
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 16.441 sec
    [junit] Running org.apache.hadoop.net.TestNetworkTopology
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.114 sec
    [junit] Running org.apache.hadoop.security.TestPermission
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 5.908 sec

checkfailure:
    [touch] Creating /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build/test/testsfailed

BUILD FAILED
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:733: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:690: The following error occurred while executing this line:
/home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Commit/trunk/build.xml:758: Tests failed!

Total time: 11 minutes 30 seconds
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure
Recording test results
Publishing Javadoc
Recording fingerprints
Updating HDFS-2241
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts

Error Message:
Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.

Stack Trace:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException: Directory /test/dfs/namesecondary is in an inconsistent state: checkpoint directory does not exist or is not accessible.
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode$CheckpointStorage.recoverCreate(SecondaryNameNode.java:801)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.initialize(SecondaryNameNode.java:222)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:175)
	at org.apache.hadoop.hdfs.server.namenode.SecondaryNameNode.<init>(SecondaryNameNode.java:168)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.canStartSecondaryNode(TestHDFSServerPorts.java:224)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.__CLR2_4_3vpy47p10ox(TestHDFSServerPorts.java:350)
	at org.apache.hadoop.hdfs.TestHDFSServerPorts.testSecondaryNodePorts(TestHDFSServerPorts.java:339)


FAILED:  org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking

Error Message:
Cannot create directory /test/dfs/name/current

Stack Trace:
java.io.IOException: Cannot create directory /test/dfs/name/current
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:276)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:492)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:512)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:169)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1367)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:242)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:113)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:626)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:541)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:257)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:85)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:243)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.__CLR2_4_3harbaz16ju(TestCheckpoint.java:560)
	at org.apache.hadoop.hdfs.server.namenode.TestCheckpoint.testSeparateEditsDirLocking(TestCheckpoint.java:553)