Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/03/31 21:58:45 UTC

Hadoop-Hdfs-trunk-Java8 - Build # 1049 - Still Failing

See https://builds.apache.org/job/Hadoop-Hdfs-trunk-Java8/1049/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 6057 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.3:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.3:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.5:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.15:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:3.0.0:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS Client ......................... SUCCESS [04:08 min]
[INFO] Apache Hadoop HDFS ................................ FAILURE [  03:32 h]
[INFO] Apache Hadoop HDFS Native Client .................. SKIPPED
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [  0.105 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:36 h
[INFO] Finished at: 2016-03-31T19:58:27+00:00
[INFO] Final Memory: 56M/374M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk-Java8/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-hdfs
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
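
A note on the resume hint in the console output above: Maven's -rf flag
restarts the reactor at the named module, so after fixing the failures the
whole 3.5-hour build does not need to be repeated. A hedged example of
filling in the <goals> placeholder (`test` is one reasonable choice; use
whatever phase you actually need):

    # Resume the reactor at the failed module, running only its tests.
    mvn test -rf :hadoop-hdfs

    # Or narrow to a single failing class via surefire's test filter.
    mvn test -rf :hadoop-hdfs -Dtest=TestRefreshCallQueue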



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.TestRefreshCallQueue.testRefresh

Error Message:
org.apache.hadoop.TestRefreshCallQueue$MockCallQueue could not be constructed.

Stack Trace:
java.lang.RuntimeException: org.apache.hadoop.TestRefreshCallQueue$MockCallQueue could not be constructed.
	at org.apache.hadoop.ipc.CallQueueManager.createCallQueueInstance(CallQueueManager.java:164)
	at org.apache.hadoop.ipc.CallQueueManager.<init>(CallQueueManager.java:70)
	at org.apache.hadoop.ipc.Server.<init>(Server.java:2579)
	at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:958)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:535)
	at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:510)
	at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:800)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:430)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:759)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:701)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:900)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:879)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1596)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1247)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1016)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:891)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:823)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.TestRefreshCallQueue.setUp(TestRefreshCallQueue.java:71)
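
CallQueueManager instantiates the configured queue class reflectively, so
"could not be constructed" almost always means the test's MockCallQueue no
longer declares the constructor signature that createCallQueueInstance
probes for. A minimal sketch of a queue class satisfying that contract,
assuming a (int, int, String, Configuration) probe signature on current
trunk; the exact arity has shifted as trunk evolved, so confirm against
CallQueueManager.createCallQueueInstance before relying on it:

    // Hedged sketch, not the actual MockCallQueue: a queue class that
    // CallQueueManager can construct reflectively. If the manager's probe
    // signature changes, this constructor must change with it, otherwise
    // construction fails with exactly the RuntimeException above.
    import java.util.concurrent.LinkedBlockingQueue;
    import org.apache.hadoop.conf.Configuration;

    public class ConstructibleCallQueue<E> extends LinkedBlockingQueue<E> {
      public ConstructibleCallQueue(int priorityLevels, int capacity,
                                    String namespace, Configuration conf) {
        super(capacity);
      }
    }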


FAILED:  org.apache.hadoop.hdfs.TestHFlush.testHFlushInterrupted

Error Message:
The stream is closed

Stack Trace:
java.io.IOException: The stream is closed
	at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:118)
	at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82)
	at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140)
	at java.io.DataOutputStream.flush(DataOutputStream.java:123)
	at java.io.FilterOutputStream.close(FilterOutputStream.java:158)
	at org.apache.hadoop.hdfs.DataStreamer.closeStream(DataStreamer.java:877)
	at org.apache.hadoop.hdfs.DataStreamer.closeInternal(DataStreamer.java:726)
	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:721)
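
The shape of this failure is a close/interrupt race rather than an hflush
bug per se: the interrupt lands while DataStreamer is tearing down, and
BufferedOutputStream.close() tries to flush buffered bytes into a socket
stream that is already shut. A hedged JDK-only illustration of that same
shape (the file name is arbitrary; no HDFS code involved):

    // close() on a BufferedOutputStream flushes first; if the underlying
    // stream was closed out from under it, the flush throws, which is what
    // surfaces above as "The stream is closed".
    import java.io.BufferedOutputStream;
    import java.io.FileOutputStream;
    import java.io.OutputStream;

    public class CloseRace {
      public static void main(String[] args) throws Exception {
        OutputStream sink = new FileOutputStream("close-race.tmp");
        BufferedOutputStream buffered = new BufferedOutputStream(sink);
        buffered.write(1);  // byte sits in the buffer, not yet written
        sink.close();       // underlying stream closed first
        buffered.close();   // flush of the buffered byte -> IOException
      }
    }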


FAILED:  org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover.testPipelineRecoveryStress

Error Message:
Deferred

Stack Trace:
java.lang.RuntimeException: Deferred
	at org.apache.hadoop.test.MultithreadedTestUtil$TestContext.checkException(MultithreadedTestUtil.java:130)
	at org.apache.hadoop.test.MultithreadedTestUtil$TestContext.waitFor(MultithreadedTestUtil.java:121)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover.testPipelineRecoveryStress(TestPipelinesFailover.java:487)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/io/retry/AtMostOnce
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:116)
	at com.sun.proxy.$Proxy23.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:426)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover$PipelineTestThread.doAnAction(TestPipelinesFailover.java:551)
	at org.apache.hadoop.test.MultithreadedTestUtil$RepeatingTestThread.doWork(MultithreadedTestUtil.java:222)
	at org.apache.hadoop.test.MultithreadedTestUtil$TestingThread.run(MultithreadedTestUtil.java:189)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.retry.AtMostOnce
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:116)
	at com.sun.proxy.$Proxy23.create(Unknown Source)
	at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:242)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1183)
	at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1125)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:415)
	at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:412)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:426)
	at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:355)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:921)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:902)
	at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover$PipelineTestThread.doAnAction(TestPipelinesFailover.java:551)
	at org.apache.hadoop.test.MultithreadedTestUtil$RepeatingTestThread.doWork(MultithreadedTestUtil.java:222)
	at org.apache.hadoop.test.MultithreadedTestUtil$TestingThread.run(MultithreadedTestUtil.java:189)
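
The root cause here is environmental rather than the pipeline logic:
org.apache.hadoop.io.retry.AtMostOnce lives in hadoop-common, and a
NoClassDefFoundError for it mid-test points at a stale or partially rebuilt
hadoop-common on the test classpath. A hedged diagnostic (only the class
name comes from the trace; the rest is illustrative):

    // Probe whether the annotation RetryInvocationHandler needs is loadable,
    // and report which jar or directory it came from.
    public class ClasspathProbe {
      public static void main(String[] args) {
        String name = "org.apache.hadoop.io.retry.AtMostOnce";
        try {
          Class<?> c = Class.forName(name);
          System.out.println(name + " loaded from "
              + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
          System.out.println(name + " missing; rebuild/reinstall hadoop-common");
        }
      }
    }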


FAILED:  org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover.testFailoverRightBeforeCommitSynchronization

Error Message:
Cannot obtain block length for LocatedBlock{BP-468873228-67.195.81.148-1459449294208:blk_1073741825_1001; getBlockSize()=2048; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:60141,DS-023ae67e-1008-4080-9691-c0acd9583ac6,DISK], DatanodeInfoWithStorage[127.0.0.1:38933,DS-620c5f95-fb2b-4cc3-aa5d-baede09a0eb5,DISK], DatanodeInfoWithStorage[127.0.0.1:52308,DS-52224881-a1c9-4198-8df6-eace139887cf,DISK]]}

Stack Trace:
java.io.IOException: Cannot obtain block length for LocatedBlock{BP-468873228-67.195.81.148-1459449294208:blk_1073741825_1001; getBlockSize()=2048; corrupt=false; offset=0; locs=[DatanodeInfoWithStorage[127.0.0.1:60141,DS-023ae67e-1008-4080-9691-c0acd9583ac6,DISK], DatanodeInfoWithStorage[127.0.0.1:38933,DS-620c5f95-fb2b-4cc3-aa5d-baede09a0eb5,DISK], DatanodeInfoWithStorage[127.0.0.1:52308,DS-52224881-a1c9-4198-8df6-eace139887cf,DISK]]}
	at org.apache.hadoop.hdfs.DFSInputStream.readBlockLength(DFSInputStream.java:434)
	at org.apache.hadoop.hdfs.DFSInputStream.fetchLocatedBlocksAndGetLastBlockLength(DFSInputStream.java:344)
	at org.apache.hadoop.hdfs.DFSInputStream.openInfo(DFSInputStream.java:277)
	at org.apache.hadoop.hdfs.DFSInputStream.<init>(DFSInputStream.java:266)
	at org.apache.hadoop.hdfs.DFSClient.open(DFSClient.java:1038)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:272)
	at org.apache.hadoop.hdfs.DistributedFileSystem$3.doCall(DistributedFileSystem.java:268)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.open(DistributedFileSystem.java:280)
	at org.apache.hadoop.fs.FileSystem.open(FileSystem.java:777)
	at org.apache.hadoop.hdfs.DFSTestUtil.getFirstBlock(DFSTestUtil.java:784)
	at org.apache.hadoop.hdfs.server.namenode.ha.TestPipelinesFailover.testFailoverRightBeforeCommitSynchronization(TestPipelinesFailover.java:351)
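
"Cannot obtain block length" means the file's last block was still under
construction when the reader opened it: DFSInputStream.readBlockLength asks
the replica datanodes for the length and none could answer, which is
plausible right after a failover interrupts commit synchronization. A hedged
sketch of the usual recovery step (dfs and path are placeholders, not names
from the test):

    // Force lease recovery so the NameNode can finalize the
    // under-construction last block; its length then becomes readable.
    // recoverLease returns true once the file is closed, and real callers
    // poll it with a timeout rather than calling it once.
    import java.io.IOException;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.hdfs.DistributedFileSystem;

    final class LeaseRecoveryHint {
      static boolean tryRecover(DistributedFileSystem dfs, Path path)
          throws IOException {
        return dfs.recoverLease(path);
      }
    }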