Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/09/16 04:54:52 UTC
Hadoop-Hdfs-trunk-Java8 - Build # 376 - Still Failing
See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/376/
###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 7922 lines...]
[INFO]
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO]
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO]
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [03:31 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [ 02:50 h]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [ 0.066 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:53 h
[INFO] Finished at: 2015-09-16T02:54:36+00:00
[INFO] Final Memory: 75M/754M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx4096m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5619433102992303020.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire2844350229076253926tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_2928530195832539987682tmp
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Sending artifact delta relative to Hadoop-Hdfs-trunk-Java8 #222
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 5377048 bytes
Compression is 0.0%
Took 2.7 sec
Recording test results
Updating HDFS-7986
Updating HDFS-9082
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure
###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.

FAILED: org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer.testBalancer0Integrity
Error Message:
org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
at org.apache.hadoop.util.IntrusiveCollection.iterator(IntrusiveCollection.java:213)
at org.apache.hadoop.util.IntrusiveCollection.clear(IntrusiveCollection.java:368)
at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.clearPendingCachingCommands(DatanodeManager.java:1587)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1212)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1539)
at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:759)
at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:924)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1866)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1835)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1828)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.doTest(TestBalancer.java:665)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.doTest(TestBalancer.java:547)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.oneNodeTest(TestBalancer.java:812)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.testBalancer0Internal(TestBalancer.java:926)
at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer.testBalancer0Integrity(TestBalancerWithSaslDataTransfer.java:34)

FAILED: org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer.testBalancer0Authentication
Error Message:
Test resulted in an unexpected exit
Stack Trace:
java.lang.AssertionError: Test resulted in an unexpected exit
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1848)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1835)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1828)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.doTest(TestBalancer.java:665)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.doTest(TestBalancer.java:547)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.oneNodeTest(TestBalancer.java:812)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.testBalancer0Internal(TestBalancer.java:926)
at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer.testBalancer0Authentication(TestBalancerWithSaslDataTransfer.java:29)

FAILED: org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer.testBalancer0Privacy
Error Message:
org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
at org.apache.hadoop.util.IntrusiveCollection.iterator(IntrusiveCollection.java:213)
at org.apache.hadoop.util.IntrusiveCollection.clear(IntrusiveCollection.java:368)
at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.clearPendingCachingCommands(DatanodeManager.java:1587)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1212)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1539)
at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:759)
at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:924)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1866)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1835)
at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1828)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.doTest(TestBalancer.java:665)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.doTest(TestBalancer.java:547)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.oneNodeTest(TestBalancer.java:812)
at org.apache.hadoop.hdfs.server.balancer.TestBalancer.testBalancer0Internal(TestBalancer.java:926)
at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithSaslDataTransfer.testBalancer0Privacy(TestBalancerWithSaslDataTransfer.java:39)
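
All three balancer failures above die inside MiniDFSCluster.shutdown. The
NoClassDefFoundError for IntrusiveCollection$IntrusiveIterator is likely a
secondary symptom: the outer class was in use throughout the test, but the
inner iterator class is first referenced during teardown, so if the class
cannot be loaded at that point (for example, the fork is already failing
or its classpath has gone away) the error surfaces only at shutdown. The
"Test resulted in an unexpected exit" assertion appears to be
MiniDFSCluster checking that nothing terminated the JVM mid-test. A
minimal sketch (illustrative names, not Hadoop's code) of how lazily
loaded inner classes produce this late-failure pattern:

    import java.util.Iterator;

    public class Outer implements Iterable<String> {
        // Outer$Iter is not loaded until iterator() is first called, so a
        // classloading problem that develops mid-run shows up as a
        // NoClassDefFoundError at shutdown time, not at construction time.
        private class Iter implements Iterator<String> {
            public boolean hasNext() { return false; }
            public String next() { throw new java.util.NoSuchElementException(); }
        }
        @Override
        public Iterator<String> iterator() { return new Iter(); }
    }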

FAILED: org.apache.hadoop.hdfs.web.TestWebHDFSOAuth2.listStatusReturnsAsExpected
Error Message:
Unable to load OAuth2 connection factory.
Stack Trace:
java.io.IOException: Unable to load OAuth2 connection factory.
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:131)
at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.loadTrustManager(ReloadingX509TrustManager.java:164)
at org.apache.hadoop.security.ssl.ReloadingX509TrustManager.<init>(ReloadingX509TrustManager.java:81)
at org.apache.hadoop.security.ssl.FileBasedKeyStoresFactory.init(FileBasedKeyStoresFactory.java:215)
at org.apache.hadoop.security.ssl.SSLFactory.init(SSLFactory.java:131)
at org.apache.hadoop.hdfs.web.URLConnectionFactory.newSslConnConfigurator(URLConnectionFactory.java:135)
at org.apache.hadoop.hdfs.web.URLConnectionFactory.newOAuth2URLConnectionFactory(URLConnectionFactory.java:111)
at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.initialize(WebHdfsFileSystem.java:159)
at org.apache.hadoop.hdfs.web.TestWebHDFSOAuth2.listStatusReturnsAsExpected(TestWebHDFSOAuth2.java:147)
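
Despite the "Unable to load OAuth2 connection factory" message, the root
frame here is FileInputStream.open inside
ReloadingX509TrustManager.loadTrustManager: the SSL truststore file the
test configuration points at apparently could not be opened, and the
OAuth2 factory merely rewraps that IOException. A hedged pre-flight check
one could add to such a test (the property name below is illustrative,
not an actual Hadoop key):

    import java.io.File;

    public class TrustStorePreflight {
        public static void main(String[] args) {
            // Hypothetical property; the real test reads its truststore
            // location from the Hadoop SSL client configuration.
            String path = System.getProperty("test.ssl.truststore", "");
            File ts = new File(path);
            if (!ts.isFile() || !ts.canRead()) {
                // Fail fast with a clear message instead of the opaque
                // IOException seen in the stack trace above.
                throw new IllegalStateException(
                    "Truststore missing or unreadable: " + ts);
            }
        }
    }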

FAILED: org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites
Error Message:
Some writers didn't complete in expected runtime! Current writer state:[Circular Writer:
directory: /test-0
target length: 50
current item: 40
done: false
, Circular Writer:
directory: /test-1
target length: 50
current item: 50
done: false
] expected:<0> but was:<2>
Stack Trace:
java.lang.AssertionError: Some writers didn't complete in expected runtime! Current writer state:[Circular Writer:
directory: /test-0
target length: 50
current item: 40
done: false
, Circular Writer:
directory: /test-1
target length: 50
current item: 50
done: false
] expected:<0> but was:<2>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.junit.Assert.assertEquals(Assert.java:555)
at org.apache.hadoop.hdfs.server.namenode.ha.TestSeveralNameNodes.testCircularLinkedListWrites(TestSeveralNameNodes.java:90)
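
This one reads as a timing assertion rather than a data bug: writer
/test-0 stalled at item 40 of 50 and /test-1 reached 50 but was never
marked done, so two writers were still outstanding when the deadline hit,
hence expected:<0> but was:<2>. A sketch of the deadline pattern the
assertion implies (names are illustrative, not from the test):

    import java.util.List;
    import java.util.concurrent.TimeUnit;

    public class DeadlineCheck {
        static class Writer { volatile boolean done; }

        // Count writers still unfinished at the deadline; the test then
        // asserts this count is 0, which fails on a slow or overloaded
        // executor even when every writer would eventually finish.
        static long unfinished(List<Writer> writers, long deadlineNanos)
                throws InterruptedException {
            while (System.nanoTime() < deadlineNanos) {
                if (writers.stream().allMatch(w -> w.done)) return 0;
                TimeUnit.MILLISECONDS.sleep(100);
            }
            return writers.stream().filter(w -> !w.done).count();
        }
    }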

FAILED: org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistLockedMemory.testReleaseOnEviction
Error Message:
expected:<1> but was:<0>
Stack Trace:
java.lang.AssertionError: expected:<1> but was:<0>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.junit.Assert.assertEquals(Assert.java:555)
at org.junit.Assert.assertEquals(Assert.java:542)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.LazyPersistTestCase.verifyRamDiskJMXMetric(LazyPersistTestCase.java:483)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistLockedMemory.testReleaseOnEviction(TestLazyPersistLockedMemory.java:122)

FAILED: org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistReplicaPlacement.testSynchronousEviction
Error Message:
expected:<1> but was:<0>
Stack Trace:
java.lang.AssertionError: expected:<1> but was:<0>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:743)
at org.junit.Assert.assertEquals(Assert.java:118)
at org.junit.Assert.assertEquals(Assert.java:555)
at org.junit.Assert.assertEquals(Assert.java:542)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.LazyPersistTestCase.verifyRamDiskJMXMetric(LazyPersistTestCase.java:483)
at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.TestLazyPersistReplicaPlacement.testSynchronousEviction(TestLazyPersistReplicaPlacement.java:92)
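
Both lazy-persist failures fail at the same point,
LazyPersistTestCase.verifyRamDiskJMXMetric, with expected:<1> but was:<0>:
a RAM-disk eviction counter read over JMX had not yet ticked when the
assertion ran, which suggests an asynchronously updated metric racing a
one-shot check. A hedged sketch of polling such a counter instead of
asserting once (the ObjectName and attribute are illustrative):

    import java.lang.management.ManagementFactory;
    import javax.management.MBeanServer;
    import javax.management.ObjectName;

    public class JmxPoll {
        // Read a numeric JMX attribute from the platform MBean server.
        static long read(String objectName, String attribute) throws Exception {
            MBeanServer mbs = ManagementFactory.getPlatformMBeanServer();
            return ((Number) mbs.getAttribute(
                new ObjectName(objectName), attribute)).longValue();
        }

        // Poll until the counter reaches the expected value or time runs
        // out; returning false corresponds to the expected:<1> but
        // was:<0> failures above.
        static boolean waitFor(String objectName, String attribute,
                               long expected, long timeoutMs) throws Exception {
            long deadline = System.currentTimeMillis() + timeoutMs;
            while (System.currentTimeMillis() < deadline) {
                if (read(objectName, attribute) == expected) return true;
                Thread.sleep(100);
            }
            return false;
        }
    }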