Posted to mapreduce-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/08/26 00:51:41 UTC

Hadoop-Mapreduce-trunk-Java8 - Build # 305 - Failure

See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/305/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10707 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.369 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.647 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestRMContainerAllocator.testAttemptNotFoundCausesRMCommunicatorException Expected exception: org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocationException

Tests run: 335, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.130 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:16 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.787 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.438 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:11 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:03 min
[INFO] Finished at: 2015-08-25T22:52:36+00:00
[INFO] Final Memory: 40M/247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Sending artifact delta relative to Hadoop-Mapreduce-trunk-Java8 #304
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 20468911 bytes
Compression is 0.0%
Took 7.1 sec
Recording test results
Updating HDFS-8846
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
REGRESSION:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testAttemptNotFoundCausesRMCommunicatorException

Error Message:
Expected exception: org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocationException

Stack Trace:
java.lang.AssertionError: Expected exception: org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocationException
	at org.junit.internal.runners.statements.ExpectException.evaluate(ExpectException.java:32)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
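
For context: the AssertionError above is what JUnit 4 raises when a test declared with an expected exception returns without throwing it. A minimal sketch of that pattern, using an illustrative test class and exception type rather than the actual TestRMContainerAllocator code:

    import org.junit.Test;

    public class ExpectedExceptionSketch {
        // JUnit's ExpectException.evaluate() (top of the stack trace above) fails
        // with "Expected exception: ..." when the test body returns normally
        // instead of throwing the declared type.
        @Test(expected = IllegalStateException.class)
        public void failsBecauseNothingIsThrown() {
            // Completing without throwing IllegalStateException produces the
            // java.lang.AssertionError reported above.
        }
    }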



Hadoop-Mapreduce-trunk-Java8 - Build # 306 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/306/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11342 lines...]
Running org.apache.hadoop.mapreduce.v2.TestUberAM

Results :

Tests in error: 
  TestSpecialCharactersInOutputPath.testJobWithDFS:109 » NoClassDefFound org/apa...
  TestMerge.testMerge:82 » YarnRuntime java.lang.NoClassDefFoundError: org/apach...
  TestNetworkedJob.testGetJobStatus:101 » IO Cannot initialize Cluster. Please c...
  TestNetworkedJob.testJobQueueClient:318->createMiniClusterWithCapacityScheduler:399 » Runtime
  TestNetworkedJob.testNetworkedJob:133->createMiniClusterWithCapacityScheduler:399 » Runtime

Tests run: 317, Failures: 0, Errors: 5, Skipped: 8

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.599 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:20 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.950 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.003 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:05 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:24 h
[INFO] Finished at: 2015-08-27T00:16:05+00:00
[INFO] Final Memory: 40M/245M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx4096m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter5600633472778433465.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3670349501822520576tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1493351779764478797929tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Sending artifact delta relative to Hadoop-Mapreduce-trunk-Java8 #304
Archived 1 artifacts
Archive block size is 32768
Received 0 blocks and 20466836 bytes
Compression is 0.0%
Took 5.2 sec
Recording test results
Updating HDFS-8951
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMerge.testMerge

Error Message:
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/protocol/proto/ClientNamenodeProtocolProtos$GetListingRequestProto$Builder

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/protocol/proto/ClientNamenodeProtocolProtos$GetListingRequestProto$Builder
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$GetListingRequestProto.newBuilder(ClientNamenodeProtocolProtos.java:29094)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:569)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:251)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
	at com.sun.proxy.$Proxy19.getListing(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1648)
	at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:210)
	at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:197)
	at org.apache.hadoop.fs.Hdfs$2.<init>(Hdfs.java:179)
	at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:179)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1489)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1484)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1484)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:778)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:672)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:97)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:151)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:210)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestMerge.testMerge(TestMerge.java:82)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testGetJobStatus

Error Message:
Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.

Stack Trace:
java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:475)
	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:465)
	at org.apache.hadoop.mapred.TestNetworkedJob.testGetJobStatus(TestNetworkedJob.java:101)
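
For reference, the IOException above is raised by Cluster.initialize() when no client protocol provider on the classpath accepts the configured mapreduce.framework.name. A minimal sketch of the configuration a client would normally supply (the host and port below are illustrative placeholders, not values from this build):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Cluster;

    public class ClusterInitSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Must match a ClientProtocolProvider available on the classpath,
            // e.g. "yarn" or "local"; otherwise Cluster.initialize() throws the
            // "Cannot initialize Cluster" IOException shown above.
            conf.set("mapreduce.framework.name", "yarn");
            // Illustrative placeholder; in TestNetworkedJob this would come from
            // the mini cluster rather than a fixed address.
            conf.set("yarn.resourcemanager.address", "rm.example.com:8032");
            Cluster cluster = new Cluster(conf);
            System.out.println("Default FS: " + cluster.getFileSystem().getUri());
            cluster.close();
        }
    }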


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testJobQueueClient

Error Message:
java.util.zip.ZipException: invalid code lengths set

Stack Trace:
java.lang.RuntimeException: java.util.zip.ZipException: invalid code lengths set
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:122)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.xerces.impl.XMLEntityManager$RewindableInputStream.read(Unknown Source)
	at org.apache.xerces.impl.XMLEntityManager.setupCurrentEntity(Unknown Source)
	at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
	at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2551)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2539)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2610)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2563)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2474)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1210)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1182)
	at org.apache.hadoop.conf.Configuration.setClass(Configuration.java:2339)
	at org.apache.hadoop.mapred.TestNetworkedJob.createMiniClusterWithCapacityScheduler(TestNetworkedJob.java:399)
	at org.apache.hadoop.mapred.TestNetworkedJob.testJobQueueClient(TestNetworkedJob.java:318)


FAILED:  org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob

Error Message:
java.util.zip.ZipException: invalid code lengths set

Stack Trace:
java.lang.RuntimeException: java.util.zip.ZipException: invalid code lengths set
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:164)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:122)
	at java.io.FilterInputStream.read(FilterInputStream.java:83)
	at org.apache.xerces.impl.XMLEntityManager$RewindableInputStream.read(Unknown Source)
	at org.apache.xerces.impl.XMLEntityManager.setupCurrentEntity(Unknown Source)
	at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
	at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2551)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2539)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2610)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2563)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2474)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1210)
	at org.apache.hadoop.conf.Configuration.set(Configuration.java:1182)
	at org.apache.hadoop.conf.Configuration.setClass(Configuration.java:2339)
	at org.apache.hadoop.mapred.TestNetworkedJob.createMiniClusterWithCapacityScheduler(TestNetworkedJob.java:399)
	at org.apache.hadoop.mapred.TestNetworkedJob.testNetworkedJob(TestNetworkedJob.java:133)


FAILED:  org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath.testJobWithDFS

Error Message:
org/apache/hadoop/HadoopIllegalArgumentException

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/HadoopIllegalArgumentException
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initNameNodeAddress(MiniDFSCluster.java:1176)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameNodes(MiniDFSCluster.java:1038)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:880)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:818)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:477)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:436)
	at org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath.testJobWithDFS(TestSpecialCharactersInOutputPath.java:109)