Posted to mapreduce-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/03/29 08:23:45 UTC

Hadoop-Mapreduce-trunk-Java8 - Build # 1199 - Still Failing

See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1199/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.392 sec - in org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.282 sec - in org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.246 sec - in org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.781 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.833 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:56 min
[INFO] Finished at: 2016-03-29T06:23:41+00:00
[INFO] Final Memory: 32M/200M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
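
The bare "null" error message above is characteristic of JUnit 4's single-argument assertTrue: when the condition is false it calls fail(null), so the AssertionError carries no message and Surefire prints "null". The actual condition checked at TestCLI.java:181 is not visible in this log; the sketch below only illustrates that reporting behaviour, with a hypothetical condition.

import static org.junit.Assert.assertTrue;

import org.junit.Test;

// Illustrative sketch only -- not the Hadoop TestCLI code.
public class AssertTrueMessageSketch {

    @Test
    public void failsWithNullMessage() {
        boolean jobFound = false;   // hypothetical stand-in for the real check
        assertTrue(jobFound);       // reported as "java.lang.AssertionError: null"
    }

    @Test
    public void failsWithDescriptiveMessage() {
        boolean jobFound = false;
        // Supplying a message makes the Jenkins report self-explanatory instead of "null".
        assertTrue("expected CLI.getJob to find the submitted job", jobFound);
    }
}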



Hadoop-Mapreduce-trunk-Java8 - Build # 1415 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1415/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8383 lines...]
Running org.apache.hadoop.mapred.TestFileInputFormat
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.911 sec - in org.apache.hadoop.mapred.TestFileInputFormat
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.205 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.287 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.09 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.548 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 244, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.300 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-05-17T15:21:10+00:00
[INFO] Final Memory: 32M/198M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1414 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1414/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8664 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.612 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebApp
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.197 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.148 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.333 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.712 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestRecovery.testCrashed:188 TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Tests run: 348, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.281 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.590 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.869 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:37 min
[INFO] Finished at: 2016-05-17T14:32:12+00:00
[INFO] Final Memory: 36M/203M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Stack Trace:
java.lang.AssertionError: TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:382)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:188)
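
The "(timedout)" marker in the message above suggests MRApp.waitForState polls the TaskAttempt and only asserts the expected state once the wait gives up, so the attempt was still in STARTING when the timeout expired. A minimal sketch of that poll-then-assert pattern, assuming such behaviour (the names, states, and timeout are illustrative, not the Hadoop code):

import static org.junit.Assert.assertEquals;

// Sketch of a poll-then-assert wait helper; assumes MRApp.waitForState works this way.
public final class WaitForStateSketch {

    public enum TaskAttemptState { STARTING, RUNNING, FAILED, SUCCEEDED }

    public interface TaskAttempt {
        TaskAttemptState getState();
    }

    public static void waitForState(TaskAttempt attempt, TaskAttemptState expected)
            throws InterruptedException {
        int polls = 0;
        while (attempt.getState() != expected && polls < 40) {  // ~20s at 500 ms per poll
            Thread.sleep(500);
            polls++;
        }
        // Fails with e.g. "TaskAttempt state is not correct (timedout)
        // expected:<FAILED> but was:<STARTING>" if the state never arrives.
        assertEquals("TaskAttempt state is not correct (timedout)",
                expected, attempt.getState());
    }
}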




Hadoop-Mapreduce-trunk-Java8 - Build # 1413 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1413/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8665 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFail
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.308 sec - in org.apache.hadoop.mapreduce.v2.app.TestFail
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.332 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.299 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.112 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.71 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestRecovery.testCrashed:188 TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Tests run: 348, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.351 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.421 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.339 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:32 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:47 min
[INFO] Finished at: 2016-05-17T13:32:27+00:00
[INFO] Final Memory: 36M/235M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Stack Trace:
java.lang.AssertionError: TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:382)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:188)




Hadoop-Mapreduce-trunk-Java8 - Build # 1412 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1412/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8385 lines...]
Running org.apache.hadoop.mapred.TestFileInputFormat
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.847 sec - in org.apache.hadoop.mapred.TestFileInputFormat
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.131 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.283 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.077 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.438 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 244, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.284 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-05-17T12:20:59+00:00
[INFO] Final Memory: 31M/192M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1411 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1411/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8383 lines...]
Running org.apache.hadoop.mapred.TestFileInputFormat
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.79 sec - in org.apache.hadoop.mapred.TestFileInputFormat
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.881 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.276 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.104 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.439 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 244, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.174 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-05-17T07:21:10+00:00
[INFO] Final Memory: 32M/185M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1410 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1410/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9196 lines...]
Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.039 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.075 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.828 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.881 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.167 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.764 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.316 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:20 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-05-17T06:20:42+00:00
[INFO] Final Memory: 34M/147M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
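
This assertion apparently counts the files left under the job's output directory after the abort path runs and expects zero, so the failure means four files survived cleanup. A hedged sketch of that kind of check, with a hypothetical helper and directory (not the Hadoop test's internals):

import static org.junit.Assert.assertEquals;

import java.io.File;

// Sketch of an "output directory must be empty after abort" check.
public class AbortCleanupCheckSketch {

    static int fileCount(File dir) {
        File[] children = dir.listFiles();
        return children == null ? 0 : children.length;
    }

    static void assertOutputDirEmpty(File outputDir) {
        // Fails as "Output directory not empty expected:<0> but was:<N>"
        // when the abort path leaves files behind.
        assertEquals("Output directory not empty", 0, fileCount(outputDir));
    }
}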


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1409 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1409/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9196 lines...]
Running org.apache.hadoop.mapred.TestMiniMRDFSCaching
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.031 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.688 sec - in org.apache.hadoop.hdfs.TestNNBench
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.501 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.719 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.372 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.048 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.431 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:23 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-05-17T04:14:51+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1408 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1408/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9276 lines...]
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.65 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests in error: 
  TestJobOutputCommitter.tearDown:72->HadoopTestCase.tearDown:170 » NoClassDefFound
  TestJobOutputCommitter.setUp:64->HadoopTestCase.setUp:156 » YarnRuntime could ...
  TestJobOutputCommitter.tearDown:71 NullPointer
  TestJobOutputCommitter.setUp:64->HadoopTestCase.setUp:156 » YarnRuntime could ...
  TestJobOutputCommitter.tearDown:71 NullPointer

Tests run: 535, Failures: 2, Errors: 5, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.223 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.883 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.285 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:07 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:18 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-05-17T02:00:41+00:00
[INFO] Final Memory: 34M/138M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomCleanup

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.HadoopTestCase.tearDown(HadoopTestCase.java:170)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:72)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testDefaultCleanupAndAbort

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1755)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:161)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:64)
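
The root failure here is that FileContext could not resolve an AbstractFileSystem for the "file" scheme: the lookup key fs.AbstractFileSystem.file.impl came back null inside the test JVM, so MiniYARNCluster startup could not obtain a local FileContext. Whether setting the key explicitly would help this particular build is an assumption; the sketch below only shows how the scheme-to-implementation mapping is configured and exercised. The LocalFs class name is the mapping core-default.xml normally supplies, stated here as an assumption rather than a confirmed fix.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileContext;

// Sketch: how the AbstractFileSystem for a scheme is wired up via configuration.
// Setting the key explicitly is for illustration only.
public class LocalFsConfigSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.AbstractFileSystem.file.impl", "org.apache.hadoop.fs.local.LocalFs");
        FileContext fc = FileContext.getLocalFSFileContext(conf);
        System.out.println("local scheme resolved to " + fc.getDefaultFileSystem().getUri());
    }
}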


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testDefaultCleanupAndAbort

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:71)
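
The NullPointerException at TestJobOutputCommitter.tearDown:71 is a secondary failure: setUp threw before the mini-cluster field was assigned, and JUnit 4 still runs @After methods when @Before fails, so tearDown dereferences a null field. A null guard in tearDown keeps the report down to the real setUp error; the field and startup call below are hypothetical stand-ins, not the Hadoop test's code:

import org.junit.After;
import org.junit.Before;

// Sketch of the setUp/tearDown coupling behind the secondary NPE above.
// JUnit 4 runs @After even when @Before throws, so tearDown must tolerate
// partially-initialised state.
public class TearDownNullGuardSketch {

    private AutoCloseable miniCluster;   // remains null if setUp fails early

    @Before
    public void setUp() throws Exception {
        miniCluster = startMiniCluster(); // may throw before the assignment happens
    }

    @After
    public void tearDown() throws Exception {
        if (miniCluster != null) {        // guard turns a cascading NPE into a no-op
            miniCluster.close();
        }
    }

    private AutoCloseable startMiniCluster() throws Exception {
        throw new IllegalStateException("placeholder for mini-cluster startup");
    }
}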


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1755)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:161)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.setUp(TestJobOutputCommitter.java:64)


FAILED:  org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.testCustomAbort

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.lib.output.TestJobOutputCommitter.tearDown(TestJobOutputCommitter.java:71)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1407 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1407/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9505 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.525 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.143 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.424 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.181 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.416 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.229 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:11 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:23 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-05-16T23:51:50+00:00
[INFO] Final Memory: 34M/144M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1406 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1406/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9518 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.323 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.pipes.TestPipes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.029 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.548 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.736 sec - in org.apache.hadoop.mapred.TestReporter

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 530, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.185 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.658 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.296 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:27 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:04 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:22 h
[INFO] Finished at: 2016-05-16T21:42:34+00:00
[INFO] Final Memory: 39M/150M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter264982520343454756.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire4268075182468606055tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2782730037742647121922tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1405 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1405/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9513 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.071 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.319 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.283 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.373 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.712 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.340 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:26 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-05-16T18:22:01+00:00
[INFO] Final Memory: 34M/157M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1404 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1404/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9515 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.484 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.839 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.977 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.305 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.732 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.253 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:19 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-05-16T10:44:58+00:00
[INFO] Final Memory: 34M/162M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1403 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1403/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9515 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.657 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.509 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.237 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.880 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:56 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.711 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.994 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:42 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:51 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-16T08:33:25+00:00
[INFO] Final Memory: 34M/141M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1402 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1402/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9238 lines...]
Running org.apache.hadoop.mapred.lib.TestDelegatingInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.47 sec - in org.apache.hadoop.mapred.lib.TestDelegatingInputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.lib.db.TestConstructQuery
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.613 sec - in org.apache.hadoop.mapred.lib.db.TestConstructQuery
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.lib.TestChainMapReduce
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.949 sec - in org.apache.hadoop.mapred.lib.TestChainMapReduce
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.lib.TestMultithreadedMapRunner
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.125 sec - in org.apache.hadoop.mapred.lib.TestMultithreadedMapRunner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.lib.TestMultipleInputs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.626 sec - in org.apache.hadoop.mapred.lib.TestMultipleInputs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestYARNRunner
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.925 sec - in org.apache.hadoop.mapred.TestYARNRunner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestMerge
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 83.169 sec - in org.apache.hadoop.mapred.TestMerge
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.436 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryInputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClusterMRNotification
Slave went offline during the build
ERROR: Connection was broken: java.io.IOException: Sorry, this connection is closed.
	at com.trilead.ssh2.transport.TransportManager.ensureConnected(TransportManager.java:587)
	at com.trilead.ssh2.transport.TransportManager.sendMessage(TransportManager.java:660)
	at com.trilead.ssh2.channel.Channel.freeupWindow(Channel.java:407)
	at com.trilead.ssh2.channel.Channel.freeupWindow(Channel.java:347)
	at com.trilead.ssh2.channel.ChannelManager.getChannelData(ChannelManager.java:943)
	at com.trilead.ssh2.channel.ChannelInputStream.read(ChannelInputStream.java:58)
	at com.trilead.ssh2.channel.ChannelInputStream.read(ChannelInputStream.java:79)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:82)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:72)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:103)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:48)
Caused by: java.io.IOException: Error: the peer is not consuming our asynchronous replies.
	at com.trilead.ssh2.transport.TransportManager.sendAsynchronousMessage(TransportManager.java:628)
	at com.trilead.ssh2.channel.Channel.freeupWindow(Channel.java:405)
	at com.trilead.ssh2.channel.Channel$Output.write(Channel.java:99)
	at com.trilead.ssh2.channel.ChannelManager.msgChannelExtendedData(ChannelManager.java:848)
	at com.trilead.ssh2.channel.ChannelManager.handleMessage(ChannelManager.java:1463)
	at com.trilead.ssh2.transport.TransportManager.receiveLoop(TransportManager.java:796)
	at com.trilead.ssh2.transport.TransportManager$1.run(TransportManager.java:489)
	at java.lang.Thread.run(Thread.java:745)

Build step 'Execute shell' marked build as failure
ERROR: Step 'Publish FindBugs analysis results' failed: no workspace for Hadoop-Mapreduce-trunk-Java8 #1402
ERROR: Step 'Archive the artifacts' failed: no workspace for Hadoop-Mapreduce-trunk-Java8 #1402
ERROR: Step 'Publish JUnit test result report' failed: no workspace for Hadoop-Mapreduce-trunk-Java8 #1402
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
ERROR: H1 is offline; cannot locate jdk-1.8.0
ERROR: H1 is offline; cannot locate jdk-1.8.0




###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Mapreduce-trunk-Java8 - Build # 1401 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1401/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9514 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.685 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.512 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.824 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.334 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.846 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.326 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:14 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:23 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-05-14T22:22:10+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1400 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1400/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8433 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.945 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.262 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.481 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 244, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.337 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-05-13T21:20:57+00:00
[INFO] Final Memory: 32M/224M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1399 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1399/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8806 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.248 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.001 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.735 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestRecovery.testOutputRecoveryMapsOnly:995 Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
  TestRecovery.testRecoverySuccessUsingCustomOutputCommitter:537 Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
  TestRecovery.testCrashed:262 Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
  TestRecovery.testSpeculative:1256 Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Tests run: 348, Failures: 4, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.552 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.985 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.443 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:00 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:15 min
[INFO] Finished at: 2016-05-13T17:38:01+00:00
[INFO] Final Memory: 36M/190M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecoveryMapsOnly

Error Message:
Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Stack Trace:
java.lang.AssertionError: Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:399)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testOutputRecoveryMapsOnly(TestRecovery.java:995)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoverySuccessUsingCustomOutputCommitter

Error Message:
Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Stack Trace:
java.lang.AssertionError: Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:399)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testRecoverySuccessUsingCustomOutputCommitter(TestRecovery.java:537)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Stack Trace:
java.lang.AssertionError: Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:399)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:262)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative

Error Message:
Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Stack Trace:
java.lang.AssertionError: Task state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:399)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative(TestRecovery.java:1256)




Hadoop-Mapreduce-trunk-Java8 - Build # 1398 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1398/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9559 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.7 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.62 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.742 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 529, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.573 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.823 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.502 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:39 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:25 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:40 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-05-13T17:19:41+00:00
[INFO] Final Memory: 34M/155M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter4295745490478206475.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire8605929818491983095tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2985322543892716641911tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1397 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1397/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9514 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.778 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.576 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.937 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.206 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.558 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.217 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:15 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:19 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-05-13T12:26:08+00:00
[INFO] Final Memory: 34M/189M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1396 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1396/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8435 lines...]
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.982 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.799 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec - in org.apache.hadoop.mapred.TestClock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.097 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.296 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:47 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:51 min
[INFO] Finished at: 2016-05-13T07:21:38+00:00
[INFO] Final Memory: 32M/185M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1395 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1395/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9502 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.pipes.TestPipes
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.029 sec - in org.apache.hadoop.mapred.pipes.TestPipes
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.639 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestReporter
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.224 sec - in org.apache.hadoop.mapred.TestReporter

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.382 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.067 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.259 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:38 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-05-13T04:31:01+00:00
[INFO] Final Memory: 34M/147M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1394 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1394/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9505 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.011 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.365 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.118 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.565 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.980 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.351 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:20 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-05-13T01:43:22+00:00
[INFO] Final Memory: 34M/150M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1393 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1393/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8783 lines...]
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.setUpClass(TestJobHistoryEventHandler.java:93)

org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler  Time elapsed: 18.503 sec  <<< ERROR!
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.cleanUpClass(TestJobHistoryEventHandler.java:98)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.072 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Tests in error: 
  TestJobHistoryEventHandler.setUpClass:93 » IO Timed out waiting for Mini HDFS ...
  TestJobHistoryEventHandler.cleanUpClass:98 NullPointer

Tests run: 337, Failures: 0, Errors: 2, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.739 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:32 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 38.283 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.566 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:49 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:13 min
[INFO] Finished at: 2016-05-12T22:35:33+00:00
[INFO] Final Memory: 36M/195M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
Timed out waiting for Mini HDFS Cluster to start

Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitClusterUp(MiniDFSCluster.java:1345)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.setUpClass(TestJobHistoryEventHandler.java:93)


FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.cleanUpClass(TestJobHistoryEventHandler.java:98)




Hadoop-Mapreduce-trunk-Java8 - Build # 1392 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1392/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8781 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.282 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.004 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.651 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestJobHistoryEventHandler.setUpClass:93 » IO Timed out waiting for Mini HDFS ...
  TestJobHistoryEventHandler.cleanUpClass:98 NullPointer

Tests run: 337, Failures: 0, Errors: 2, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.065 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.343 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.093 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:00 min
[INFO] Finished at: 2016-05-12T20:31:28+00:00
[INFO] Final Memory: 35M/220M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
Timed out waiting for Mini HDFS Cluster to start

Stack Trace:
java.io.IOException: Timed out waiting for Mini HDFS Cluster to start
	at org.apache.hadoop.hdfs.MiniDFSCluster.waitClusterUp(MiniDFSCluster.java:1345)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:848)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.setUpClass(TestJobHistoryEventHandler.java:93)


FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.cleanUpClass(TestJobHistoryEventHandler.java:98)




Hadoop-Mapreduce-trunk-Java8 - Build # 1391 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1391/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9507 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.633 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.57 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.668 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.068 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.652 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.890 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:31 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:37 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-05-12T19:09:52+00:00
[INFO] Final Memory: 34M/148M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1390 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1390/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8434 lines...]
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.053 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.643 sec - in org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.9 sec - in org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.948 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.922 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [02:29 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:34 min
[INFO] Finished at: 2016-05-12T13:28:16+00:00
[INFO] Final Memory: 32M/203M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1389 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1389/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8942 lines...]
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.097 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.951 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestStagingCleanup.testStagingCleanupOrder:469 » NoClassDefFound org/apache/ha...
  TestKill.testKillTaskAttempt:363 » NoClassDefFound org/apache/hadoop/yarn/even...
  TestKill.testKillTaskWaitKillJobAfterTA_DONE:221 » Metrics Metrics source MRAp...
  TestKill.testKillJob:63 » Metrics Metrics source MRAppMetrics already exists!
  TestKill.testKillTaskWaitKillJobBeforeTA_DONE:269 » Metrics Metrics source MRA...
  TestKill.testKillTask:98 » Metrics Metrics source MRAppMetrics already exists!
  TestKill.testKillTaskWait:183 » Metrics Metrics source MRAppMetrics already ex...

Tests run: 325, Failures: 0, Errors: 6, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.427 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.904 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.404 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [08:44 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 10:56 min
[INFO] Finished at: 2016-05-12T10:43:38+00:00
[INFO] Final Memory: 35M/207M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefirebooter8640392248720330213.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire6087002405944488634tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire_782159756869816404064tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskAttempt

Error Message:
org/apache/hadoop/yarn/event/AsyncDispatcher$1

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/AsyncDispatcher$1
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.yarn.event.AsyncDispatcher.createThread(AsyncDispatcher.java:85)
	at org.apache.hadoop.yarn.event.AsyncDispatcher.serviceStart(AsyncDispatcher.java:129)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1226)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:301)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskAttempt(TestKill.java:363)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobAfterTA_DONE

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestKill$3.<init>(TestKill.java:221)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobAfterTA_DONE(TestKill.java:221)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestKill$BlockingMRApp.<init>(TestKill.java:417)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:63)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobBeforeTA_DONE

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestKill$4.<init>(TestKill.java:269)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWaitKillJobBeforeTA_DONE(TestKill.java:269)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestKill$BlockingMRApp.<init>(TestKill.java:417)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask(TestKill.java:98)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWait

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:229)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestKill$2.<init>(TestKill.java:183)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTaskWait(TestKill.java:183)




Hadoop-Mapreduce-trunk-Java8 - Build # 1388 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1388/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9269 lines...]
  TestHsWebServicesAcls.testGetJobConfAcls:138 » NoClassDefFound Could not initi...
  TestHsWebServicesAcls.testGetJobTaskAcls:172 » NoClassDefFound Could not initi...
  TestHsWebServicesAcls.testGetJobTasksAcls:155 » NoClassDefFound Could not init...
  TestHsWebServicesAcls.testGetJobTaskAttemptIdCountersAcls:242 » NoClassDefFound
  TestJobHistoryEntities.testCopmletedJobReportWithZeroTasks:125 » NoClassDefFound
  TestJobHistoryEntities.testGetTaskAttemptCompletionEvent:210 » NoClassDefFound
  TestJobHistoryEntities.testCompletedTaskAttempt:171 » NoClassDefFound org/apac...
  TestJobHistoryEntities.testCompletedJob:99 » NoClassDefFound org/apache/hadoop...
  TestJobHistoryEntities.testCompletedJobWithDiagnostics:264 » NoClassDefFound o...
  TestJobHistoryEntities.testCompletedTask:142 » NoClassDefFound org/apache/hado...
  TestJobHistoryEntities.testCopmletedJobReportWithZeroTasks:125 » NoClassDefFound
  TestJobHistoryEntities.testGetTaskAttemptCompletionEvent:210 » NoClassDefFound
  TestJobHistoryEntities.testCompletedTaskAttempt:171 » NoClassDefFound org/apac...
  TestJobHistoryEntities.testCompletedJob:99 » NoClassDefFound org/apache/hadoop...
  TestJobHistoryEntities.testCompletedJobWithDiagnostics:264 » NoClassDefFound o...
  TestJobHistoryEntities.testCompletedTask:142 » NoClassDefFound org/apache/hado...

Tests run: 207, Failures: 8, Errors: 21, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.412 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.213 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.639 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:32 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [06:05 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:49 min
[INFO] Finished at: 2016-05-12T03:15:11+00:00
[INFO] Final Memory: 38M/235M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefirebooter6451416354686986110.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefire3460155293781293656tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefire_133938803200588785492tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
29 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCopmletedJobReportWithZeroTasks[0]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCopmletedJobReportWithZeroTasks(TestJobHistoryEntities.java:125)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testGetTaskAttemptCompletionEvent[0]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testGetTaskAttemptCompletionEvent(TestJobHistoryEntities.java:210)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTaskAttempt[0]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTaskAttempt(TestJobHistoryEntities.java:171)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJob[0]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJob(TestJobHistoryEntities.java:99)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJobWithDiagnostics[0]

Error Message:
org/apache/hadoop/yarn/api/records/impl/pb/ProtoBase

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/api/records/impl/pb/ProtoBase
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2211)
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2176)
	at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:56)
	at org.apache.hadoop.yarn.util.Records.newRecord(Records.java:36)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.constructJobReport(CompletedJob.java:137)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.getReport(CompletedJob.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJobWithDiagnostics(TestJobHistoryEntities.java:264)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTask[0]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTask(TestJobHistoryEntities.java:142)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCopmletedJobReportWithZeroTasks[1]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCopmletedJobReportWithZeroTasks(TestJobHistoryEntities.java:125)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testGetTaskAttemptCompletionEvent[1]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testGetTaskAttemptCompletionEvent(TestJobHistoryEntities.java:210)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTaskAttempt[1]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTaskAttempt(TestJobHistoryEntities.java:171)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJob[1]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJob(TestJobHistoryEntities.java:99)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJobWithDiagnostics[1]

Error Message:
org/apache/hadoop/yarn/api/records/impl/pb/ProtoBase

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/api/records/impl/pb/ProtoBase
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2211)
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2176)
	at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:56)
	at org.apache.hadoop.yarn.util.Records.newRecord(Records.java:36)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.constructJobReport(CompletedJob.java:137)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.getReport(CompletedJob.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedJobWithDiagnostics(TestJobHistoryEntities.java:264)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTask[1]

Error Message:
org/apache/hadoop/yarn/util/ConverterUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/util/ConverterUtils
	at org.apache.hadoop.mapreduce.jobhistory.AMStartedEvent.getAppAttemptId(AMStartedEvent.java:110)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleAMStartedEvent(JobHistoryParser.java:414)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.handleEvent(JobHistoryParser.java:237)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:112)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:153)
	at org.apache.hadoop.mapreduce.jobhistory.JobHistoryParser.parse(JobHistoryParser.java:139)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.loadFullHistoryData(CompletedJob.java:369)
	at org.apache.hadoop.mapreduce.v2.hs.CompletedJob.<init>(CompletedJob.java:105)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities.testCompletedTask(TestJobHistoryEntities.java:142)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHSXML

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHSXML(TestHsWebServices.java:154)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfo

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfo(TestHsWebServices.java:165)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfoDefault

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfoDefault(TestHsWebServices.java:188)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHS

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHS(TestHsWebServices.java:121)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHSDefault

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHSDefault(TestHsWebServices.java:143)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHSSlash

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testHSSlash(TestHsWebServices.java:132)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfoXML

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfoXML(TestHsWebServices.java:200)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfoSlash

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices.testInfoSlash(TestHsWebServices.java:177)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobCountersAcls

Error Message:
org/apache/hadoop/util/PlatformName

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:393)
	at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:438)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobCounters(HsWebServices.java:253)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobCountersAcls(TestHsWebServicesAcls.java:121)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptsAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobTaskAttempts(HsWebServices.java:347)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptsAcls(TestHsWebServicesAcls.java:206)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptIdAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobTaskAttemptId(HsWebServices.java:370)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptIdAcls(TestHsWebServicesAcls.java:223)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetSingleTaskCountersAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getSingleTaskCounters(HsWebServices.java:326)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetSingleTaskCountersAcls(TestHsWebServicesAcls.java:189)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJob(HsWebServices.java:224)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobAcls(TestHsWebServicesAcls.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobConfAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobConf(HsWebServices.java:265)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobConfAcls(TestHsWebServicesAcls.java:138)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobTask(HsWebServices.java:311)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAcls(TestHsWebServicesAcls.java:172)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTasksAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobTasks(HsWebServices.java:284)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTasksAcls(TestHsWebServicesAcls.java:155)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptIdCountersAcls

Error Message:
Could not initialize class org.apache.hadoop.security.UserGroupInformation

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.security.UserGroupInformation
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.hasAccess(HsWebServices.java:90)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.checkAccess(HsWebServices.java:97)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.HsWebServices.getJobTaskAttemptIdCounters(HsWebServices.java:390)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptIdCountersAcls(TestHsWebServicesAcls.java:242)
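
Twenty-one of the twenty-nine failures above are classpath problems rather than test logic: NoClassDefFoundError for org/apache/hadoop/yarn/util/ConverterUtils and org/apache/hadoop/yarn/api/records/impl/pb/ProtoBase in the TestJobHistoryEntities cases, and a PlatformName error that then leaves org.apache.hadoop.security.UserGroupInformation uninitialized for the TestHsWebServicesAcls cases. The eight TestHsWebServices content-type mismatches (text/html instead of application/json or application/xml) would be consistent with the same root cause, since a web resource whose classes fail to load tends to be answered by an error page. A minimal sketch for checking the classpath locally follows; the class name ClasspathProbe is invented for illustration, and the assumption is that it is compiled and run with the same classpath the surefire fork used:

    // Reports which of the classes named in the failures above are resolvable.
    // Not part of the build; purely a local diagnostic sketch.
    public class ClasspathProbe {
        public static void main(String[] args) {
            String[] needed = {
                "org.apache.hadoop.yarn.util.ConverterUtils",
                "org.apache.hadoop.yarn.api.records.impl.pb.ProtoBase",
                "org.apache.hadoop.util.PlatformName"
            };
            for (String name : needed) {
                try {
                    // Resolves with the probe's own classloader, mirroring what the tests do.
                    Class.forName(name);
                    System.out.println("FOUND    " + name);
                } catch (ClassNotFoundException | LinkageError e) {
                    System.out.println("MISSING  " + name + "  (" + e + ")");
                }
            }
        }
    }

If a YARN class is reported missing, the artifact that provides it (hadoop-yarn-common in the ConverterUtils case) was presumably absent or stale in the repository the fork resolved against; on a shared Jenkins slave that is more often an environment problem than a POM change.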




Hadoop-Mapreduce-trunk-Java8 - Build # 1387 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1387/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9541 lines...]
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.506 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.673 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.667 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestClusterMRNotification>NotificationTestCase.testMR:205 expected:<6> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.444 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.677 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.312 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:12 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:24 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-05-12T01:17:57+00:00
[INFO] Final Memory: 34M/144M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestClusterMRNotification.testMR

Error Message:
expected:<6> but was:<4>

Stack Trace:
java.lang.AssertionError: expected:<6> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.apache.hadoop.mapred.NotificationTestCase.testMR(NotificationTestCase.java:205)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1386 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1386/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8921 lines...]
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.64 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.047 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.802 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.621 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager

Results :

Tests in error: 
  TestJobListCache.testAddExisting:39 »  test timed out after 1000 milliseconds

Tests run: 210, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.085 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:31 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.711 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.618 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [13:32 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [08:43 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:37 min
[INFO] Finished at: 2016-05-11T20:40:11+00:00
[INFO] Final Memory: 38M/193M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting

Error Message:
test timed out after 1000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 1000 milliseconds
	at org.mockito.asm.Type.getDescriptor(Type.java:515)
	at org.mockito.asm.Type.getDescriptor(Type.java:481)
	at org.mockito.cglib.core.CodeEmitter.emit_field(CodeEmitter.java:469)
	at org.mockito.cglib.core.CodeEmitter.getfield(CodeEmitter.java:426)
	at org.mockito.cglib.proxy.MethodInterceptorGenerator$2.processCase(MethodInterceptorGenerator.java:226)
	at org.mockito.cglib.core.EmitUtils$6.processCase(EmitUtils.java:294)
	at org.mockito.cglib.core.CodeEmitter.process_switch(CodeEmitter.java:641)
	at org.mockito.cglib.core.CodeEmitter.process_switch(CodeEmitter.java:603)
	at org.mockito.cglib.core.EmitUtils.string_switch_hash(EmitUtils.java:269)
	at org.mockito.cglib.core.EmitUtils.string_switch(EmitUtils.java:171)
	at org.mockito.cglib.proxy.MethodInterceptorGenerator.generateFindProxy(MethodInterceptorGenerator.java:234)
	at org.mockito.cglib.proxy.MethodInterceptorGenerator.generate(MethodInterceptorGenerator.java:134)
	at org.mockito.cglib.proxy.Enhancer.emitMethods(Enhancer.java:948)
	at org.mockito.cglib.proxy.Enhancer.generateClass(Enhancer.java:499)
	at org.mockito.cglib.core.DefaultGeneratorStrategy.generate(DefaultGeneratorStrategy.java:25)
	at org.mockito.cglib.core.AbstractClassGenerator.create(AbstractClassGenerator.java:217)
	at org.mockito.cglib.proxy.Enhancer.createHelper(Enhancer.java:378)
	at org.mockito.cglib.proxy.Enhancer.createClass(Enhancer.java:318)
	at org.mockito.internal.creation.jmock.ClassImposterizer.createProxyClass(ClassImposterizer.java:93)
	at org.mockito.internal.creation.jmock.ClassImposterizer.imposterise(ClassImposterizer.java:50)
	at org.mockito.internal.util.MockUtil.createMock(MockUtil.java:54)
	at org.mockito.internal.MockitoCore.mock(MockitoCore.java:45)
	at org.mockito.Mockito.mock(Mockito.java:921)
	at org.mockito.Mockito.mock(Mockito.java:816)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobListCache.testAddExisting(TestJobListCache.java:39)
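
The single failure in this build is a timeout rather than an assertion: the trace shows the test's 1000 ms budget being spent inside Mockito's cglib proxy generation, which on a busy executor can take longer than a second the first time a class is mocked. The mechanism is JUnit 4's per-test timeout; the sketch below is illustrative only (the class and method names are invented and 5000 ms is an arbitrary example budget), not the actual Hadoop test:

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class TimeoutBudgetExample {
        // JUnit 4 runs the test body on a separate thread and fails it with
        // "test timed out after <n> milliseconds" once the budget is exceeded,
        // which is the message format reported above.
        @Test(timeout = 5000)
        public void finishesWithinBudget() throws InterruptedException {
            Thread.sleep(100); // stand-in for work such as the first Mockito mock() call
            assertTrue(true);
        }
    }

Whether the right fix is a larger budget on TestJobListCache.testAddExisting or simply a less loaded slave cannot be told from this log alone.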




Hadoop-Mapreduce-trunk-Java8 - Build # 1385 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1385/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 15097 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.589 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.75 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.215 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.417 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 520, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.436 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.598 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.551 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:22 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:54 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-11T19:07:30+00:00
[INFO] Final Memory: 34M/146M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter3434796768488018636.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire7152649698736072668tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2606034661737822835944tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1384 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1384/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9513 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.618 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.55 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.947 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.218 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.077 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.267 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:21 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-05-11T10:01:23+00:00
[INFO] Final Memory: 34M/142M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1383 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1383/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9372 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestGetSplitHosts
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.189 sec - in org.apache.hadoop.mapred.TestGetSplitHosts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.441 sec - in org.apache.hadoop.mapred.TestFileOutputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLineRecordReaderJobs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.197 sec - in org.apache.hadoop.mapred.TestLineRecordReaderJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobCleanup
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 191.648 sec - in org.apache.hadoop.mapred.TestJobCleanup
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobName

Results :

Tests run: 395, Failures: 0, Errors: 0, Skipped: 5

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.190 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.706 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.126 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:17 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:21 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:34 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 h
[INFO] Finished at: 2016-05-11T04:19:28+00:00
[INFO] Final Memory: 40M/197M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter3717974280280317428.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire5525979157370180783tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1986739113073769896707tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk-Java8 - Build # 1382 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1382/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8943 lines...]
Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>

Tests in error: 
  TestContainerLauncher.testPoolLimits:184 NoClassDefFound org/apache/hadoop/yar...
  TestContainerLauncher.testPoolSize:98 NoClassDefFound org/apache/hadoop/yarn/a...
  TestMapReduceChildJVM.testCommandLine:54 » NoClassDefFound org/apache/hadoop/y...
  TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle:87->testReduceCommandLine:108 » Metrics
  TestMapReduceChildJVM.testCommandLineWithLog4JConifg:153 » Metrics Metrics sou...
  TestMapReduceChildJVM.testReduceCommandLine:102->testReduceCommandLine:108 » Metrics
  TestMapReduceChildJVM.testAutoHeapSizes:183->testAutoHeapSize:227 » Metrics Me...
  TestMapReduceChildJVM.testEnvironmentVariables:281 » Metrics Metrics source MR...
  TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle:96->testReduceCommandLine:108 » Metrics
  TestJobHistoryEventHandler.testTimelineEventHandling:497 » YarnRuntime org.apa...

Tests run: 345, Failures: 1, Errors: 8, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.098 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 40.118 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.903 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:32 min
[INFO] Finished at: 2016-05-11T00:32:20+00:00
[INFO] Final Memory: 35M/208M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefirebooter1838938381007601195.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire8942351105962014646tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire_866202727110070477838tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
9 tests failed.
FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.testTimelineEventHandling

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:866)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:987)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1089)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.testTimelineEventHandling(TestJobHistoryEventHandler.java:497)
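
For illustration only: the WebAppException above typically means the ResourceManager web app inside the MiniYARNCluster could not bind its HTTP port on the build slave. A minimal sketch, assuming the standard YarnConfiguration constant and MiniYARNCluster API, of how a test can request ephemeral ports instead of the fixed defaults (this is not the actual TestJobHistoryEventHandler code):

  import org.apache.hadoop.yarn.conf.YarnConfiguration;
  import org.apache.hadoop.yarn.server.MiniYARNCluster;

  public class EphemeralPortMiniClusterExample {
    public static void main(String[] args) {
      YarnConfiguration conf = new YarnConfiguration();
      // Let the mini cluster pick free ephemeral ports so a port that is
      // already bound on the build slave cannot break the RM web app start.
      conf.setBoolean(YarnConfiguration.YARN_MINICLUSTER_FIXED_PORTS, false);
      MiniYARNCluster cluster =
          new MiniYARNCluster("ephemeral-port-example", 1, 1, 1);
      cluster.init(conf);
      cluster.start();
      cluster.stop();
    }
  }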


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)
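
For illustration only: "expected:<KILLED> but was:<NEW>" suggests the assertion ran before the task had a chance to transition out of NEW after the kill was issued. One generic way to make such a check robust is sketched below with the stock GenericTestUtils helper and a hypothetical task handle; the MRApp test harness has its own waitForState helpers, so this is not the actual fix:

  import org.apache.hadoop.mapreduce.v2.api.records.TaskState;
  import org.apache.hadoop.mapreduce.v2.app.job.Task;
  import org.apache.hadoop.test.GenericTestUtils;

  public class WaitForKilledExample {
    // Poll the task state instead of asserting it immediately after the kill.
    static void waitUntilKilled(final Task task) throws Exception {
      GenericTestUtils.waitFor(
          () -> task.getState() == TaskState.KILLED,
          100,       // check every 100 ms
          10_000);   // give up after 10 s
    }
  }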


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLine

Error Message:
org/apache/hadoop/yarn/event/AsyncDispatcher$1

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/AsyncDispatcher$1
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.yarn.event.AsyncDispatcher.createThread(AsyncDispatcher.java:85)
	at org.apache.hadoop.yarn.event.AsyncDispatcher.serviceStart(AsyncDispatcher.java:129)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceStart(MRAppMaster.java:1226)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:301)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.submit(MRApp.java:285)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLine(TestMapReduceChildJVM.java:54)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:230)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:108)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateShuffle(TestMapReduceChildJVM.java:87)
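
For illustration only: "Metrics source MRAppMetrics already exists!" is the usual symptom of the static DefaultMetricsSystem still holding a source name registered by an earlier test in the same forked JVM (here, the test that died with the NoClassDefFoundError above never tore its MRApp down). A minimal JUnit 4 sketch, assuming only the public DefaultMetricsSystem API, of two common ways to avoid the collision (not the actual TestMapReduceChildJVM code):

  import org.apache.hadoop.metrics2.lib.DefaultMetricsSystem;
  import org.junit.After;
  import org.junit.Before;

  public class MetricsResetExample {
    @Before
    public void allowDuplicateSources() {
      // In mini-cluster mode the metrics system tolerates re-registration
      // of a source name such as "MRAppMetrics" within one JVM.
      DefaultMetricsSystem.setMiniClusterMode(true);
    }

    @After
    public void resetMetricsSystem() {
      // Drop every registered source so the next test starts clean.
      DefaultMetricsSystem.shutdown();
    }
  }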


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLineWithLog4JConifg

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:230)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testCommandLineWithLog4JConifg(TestMapReduceChildJVM.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:230)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:108)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:102)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSizes

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:230)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSize(TestMapReduceChildJVM.java:227)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testAutoHeapSizes(TestMapReduceChildJVM.java:183)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testEnvironmentVariables

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:230)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testEnvironmentVariables(TestMapReduceChildJVM.java:281)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle

Error Message:
Metrics source MRAppMetrics already exists!

Stack Trace:
org.apache.hadoop.metrics2.MetricsException: Metrics source MRAppMetrics already exists!
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.newSourceName(DefaultMetricsSystem.java:143)
	at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.sourceName(DefaultMetricsSystem.java:120)
	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:230)
	at org.apache.hadoop.metrics2.MetricsSystem.register(MetricsSystem.java:71)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:59)
	at org.apache.hadoop.mapreduce.v2.app.metrics.MRAppMetrics.create(MRAppMetrics.java:54)
	at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.<init>(MRAppMaster.java:263)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:235)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM$MyMRApp.<init>(TestMapReduceChildJVM.java:256)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLine(TestMapReduceChildJVM.java:108)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestMapReduceChildJVM.testReduceCommandLineWithSeparateCRLAShuffle(TestMapReduceChildJVM.java:96)




Hadoop-Mapreduce-trunk-Java8 - Build # 1381 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1381/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9422 lines...]
Tests run: 3, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 1.181 sec <<< FAILURE! - in org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter
testAbort(org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter)  Time elapsed: 0.883 sec  <<< FAILURE!
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestMRIntermediateDataEncryption

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 377, Failures: 1, Errors: 0, Skipped: 6

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.617 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:01 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.110 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.531 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:45 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:35 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:55 h
[INFO] Finished at: 2016-05-10T22:39:36+00:00
[INFO] Final Memory: 43M/160M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter6393199662187103042.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire6226239566294501990tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_200418486021923327413tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
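
For illustration only: the assertion compares the number of files left in the job output directory after the committer's abort path has run; four stale task files produce exactly the "expected:<0> but was:<4>" mismatch. A self-contained sketch of the invariant being checked, using a hypothetical output path rather than the directory the real test uses:

  import static org.junit.Assert.assertEquals;

  import org.apache.hadoop.conf.Configuration;
  import org.apache.hadoop.fs.FileStatus;
  import org.apache.hadoop.fs.FileSystem;
  import org.apache.hadoop.fs.Path;
  import org.junit.Test;

  public class OutputDirEmptyCheckExample {
    @Test
    public void outputDirIsEmptyAfterAbort() throws Exception {
      Configuration conf = new Configuration();
      Path outDir = new Path("build/test/example-output"); // hypothetical path
      FileSystem fs = outDir.getFileSystem(conf);
      fs.mkdirs(outDir); // in the real test the committer created this directory
      // After abort, nothing should remain under the output directory.
      FileStatus[] leftovers = fs.listStatus(outDir);
      assertEquals("Output directory not empty", 0, leftovers.length);
    }
  }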




Hadoop-Mapreduce-trunk-Java8 - Build # 1380 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1380/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9523 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.94 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.423 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.223 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.879 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 532, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.201 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.561 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.242 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:20 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:57 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:16 h
[INFO] Finished at: 2016-05-10T19:32:07+00:00
[INFO] Final Memory: 34M/145M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter3222537071942885708.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire6494157958975856980tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2434653476917843992948tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)
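
For illustration only: both TestMiniMRChildTask failures say the "environment checker" job itself failed, i.e. the child task JVM did not see the environment the driver configured. A rough sketch of what such a checker looks like in the old mapred API, using hypothetical names (MY_PATH / myval) rather than the variables the real test sets through mapreduce.map.env or the deprecated mapred.child.env:

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapred.MapReduceBase;
  import org.apache.hadoop.mapred.Mapper;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reporter;

  // The driver would set conf.set("mapreduce.map.env", "MY_PATH=myval");
  // the mapper then fails the job if the variable never reached the child.
  public class EnvCheckMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, Text> {

    @Override
    public void map(LongWritable key, Text value,
        OutputCollector<Text, Text> out, Reporter reporter) throws IOException {
      String actual = System.getenv("MY_PATH");
      if (!"myval".equals(actual)) {
        throw new IOException("MY_PATH not propagated to child task: " + actual);
      }
    }
  }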




Hadoop-Mapreduce-trunk-Java8 - Build # 1379 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1379/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9503 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.24 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.303 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.478 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.736 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.052 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.397 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:12 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-05-10T09:02:49+00:00
[INFO] Final Memory: 34M/143M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1378 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1378/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8786 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.519 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.67 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.083 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 348, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.823 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:34 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 50.989 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.978 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [13:40 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:20 min
[INFO] Finished at: 2016-05-09T23:39:44+00:00
[INFO] Final Memory: 36M/203M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk-Java8 - Build # 1377 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1377/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9535 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.434 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.505 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests in error: 
  TestLazyOutput.testLazyOutput:196 » NoClassDefFound org/apache/hadoop/util/Shu...

Tests run: 533, Failures: 2, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.468 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:53 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.819 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.380 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-05-09T01:24:49+00:00
[INFO] Final Memory: 34M/152M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestLazyOutput.testLazyOutput

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:681)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:172)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.TestLazyOutput.testLazyOutput(TestLazyOutput.java:196)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1376 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1376/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9502 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.261 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.139 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.504 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.705 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.184 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.616 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:40 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-05-08T20:27:15+00:00
[INFO] Final Memory: 34M/133M
[INFO] ------------------------------------------------------------------------
[WARNING] The requested profile "parallel-tests" could not be activated because it does not exist.
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1375 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1375/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9503 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.167 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.255 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.218 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.512 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.694 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:52 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.626 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.437 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:23 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-05-07T09:22:23+00:00
[INFO] Final Memory: 34M/136M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1374 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1374/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8431 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.122 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.256 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.081 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.468 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.263 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:41 min
[INFO] Finished at: 2016-05-07T05:15:04+00:00
[INFO] Final Memory: 31M/190M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
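
The "null" error message is the usual signature of a bare, message-less JUnit assertion: org.junit.Assert.assertTrue(boolean) delegates to assertTrue(null, condition) and then Assert.fail(null), which throws an AssertionError whose getMessage() is null, and the surefire report prints that as the error message. A minimal hypothetical sketch (not the actual TestCLI code):

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class BareAssertSketch {

      @Test
      public void testGetJob() {
        boolean jobFound = false; // assume the CLI job lookup returned nothing usable

        // No message argument: a failure is reported with a null error
        // message, with the assertTrue -> fail frames seen in the trace above.
        assertTrue(jobFound);
      }
    }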




Hadoop-Mapreduce-trunk-Java8 - Build # 1373 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1373/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9520 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.088 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.262 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.154 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.535 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 530, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.205 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.188 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.264 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:55 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-05-07T03:09:24+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter5506500744826102419.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire7642719309837069537tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2067137685204003261669tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1372 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1372/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8433 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.056 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.285 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.084 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.481 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.510 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-05-06T22:21:18+00:00
[INFO] Final Memory: 31M/192M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1371 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1371/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9500 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 76.224 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 70.552 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.303 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 177.303 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.467 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:18 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 34.329 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.358 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:15 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:12 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:34 h
[INFO] Finished at: 2016-05-06T10:37:54+00:00
[INFO] Final Memory: 34M/183M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1370 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1370/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8430 lines...]
Running org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.412 sec - in org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.215 sec - in org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.205 sec - in org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.693 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.181 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-05-06T02:58:57+00:00
[INFO] Final Memory: 31M/212M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1369 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1369/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9523 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.168 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.266 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.315 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.521 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 532, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.973 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.726 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.949 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:51 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-05-06T02:52:05+00:00
[INFO] Final Memory: 34M/145M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter5014456635860418221.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire4153208382250420763tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2343255902107551270015tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1368 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1368/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9503 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.788 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.659 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.475 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.977 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.552 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:46 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.050 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.181 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-05-06T00:42:14+00:00
[INFO] Final Memory: 34M/148M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1367 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1367/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9501 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.256 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.598 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.723 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 197.634 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.801 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:22 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.602 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.970 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:46 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:05 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:28 h
[INFO] Finished at: 2016-05-05T22:03:05+00:00
[INFO] Final Memory: 34M/133M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1366 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1366/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8938 lines...]
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.903 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServices
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestBlocks
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.221 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestBlocks
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.webapp.TestHSWebApp
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.374 sec - in org.apache.hadoop.mapreduce.v2.hs.webapp.TestHSWebApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestCompletedTask
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.722 sec - in org.apache.hadoop.mapreduce.v2.hs.TestCompletedTask

Results :

Tests in error: 
  TestJobHistoryServer.testReports:111 » WebApp Error starting http server
  TestJobHistoryServer.testLaunch:206 » NoClassDefFound org/apache/hadoop/yarn/Y...

Tests run: 200, Failures: 0, Errors: 2, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.350 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:54 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.473 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.301 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [06:16 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:10 min
[INFO] Finished at: 2016-05-05T14:39:02+00:00
[INFO] Final Memory: 37M/215M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefirebooter5185383303350461127.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefire5446741497968842582tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefire_1204097501565036334484tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testReports

Error Message:
Error starting http server

Stack Trace:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.getWebAppsPath(HttpServer2.java:785)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:341)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:109)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:291)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.build(WebApps.java:276)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:345)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService.initializeWebApp(HistoryClientService.java:164)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService.serviceStart(HistoryClientService.java:121)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStart(JobHistoryServer.java:202)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testReports(TestJobHistoryServer.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testLaunch

Error Message:
org/apache/hadoop/yarn/YarnUncaughtExceptionHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/YarnUncaughtExceptionHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.launchJobHistoryServer(JobHistoryServer.java:217)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryServer.testLaunch(TestJobHistoryServer.java:206)




Hadoop-Mapreduce-trunk-Java8 - Build # 1365 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1365/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9500 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.259 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.323 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.491 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.533 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.357 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.479 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.550 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:14 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-05-05T11:05:29+00:00
[INFO] Final Memory: 34M/143M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1364 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1364/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9811 lines...]

Tests in error: 
  TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled:208 » IO Jo...
  TestNetworkedJob.testGetNullCounters:61 » NoClassDefFound org/apache/hadoop/co...
  TestDFSIO.beforeClass:212 » NoClassDefFound org/apache/hadoop/util/PlatformNam...
  TestSlive.testCreateOp:290->runOperationOk:346 » NoClassDefFound org/apache/ha...
  TestSlive.testMkdir:503->runOperationOk:346 » NoClassDefFound org/apache/hadoo...
  TestSlive.testSleep:447->runOperationOk:346 » NoClassDefFound org/apache/hadoo...
  TestSlive.testList:460->runOperationOk:346 » NoClassDefFound org/apache/hadoop...
  TestSlive.testRead:434->runOperationOk:346 » NoClassDefFound org/apache/hadoop...
  TestSlive.testSelection:167 » NoClassDefFound org/apache/hadoop/fs/FSDataInput...
  TestSlive.testOpFailures:297 NoClassDefFound org/apache/hadoop/fs/FSDataInputS...
  TestSlive.testDelete:381->runOperationOk:346 » NoClassDefFound org/apache/hado...
  TestSlive.testMRFlow:414 NoClassDefFound org/apache/hadoop/fs/FSDataInputStrea...
  TestSlive.testRename:402->runOperationOk:346 » NoClassDefFound org/apache/hado...
  TestSlive.testAppendOp:532->runOperationOk:346 » NoClassDefFound org/apache/ha...
  TestSlive.testTruncateOp:554->runOperationOk:346 » NoClassDefFound org/apache/...

Tests run: 365, Failures: 2, Errors: 14, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.392 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 38.822 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.534 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:37 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:44 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:04 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:28 h
[INFO] Finished at: 2016-05-05T08:55:29+00:00
[INFO] Final Memory: 41M/138M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter3561707715760717646.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3993577379209557329tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2186081885288624436023tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
16 tests failed.
FAILED:  org.apache.hadoop.fs.TestDFSIO.org.apache.hadoop.fs.TestDFSIO

Error Message:
org/apache/hadoop/util/PlatformName

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:393)
	at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:438)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1097)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:395)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:228)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1005)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:891)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:823)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.fs.TestDFSIO.beforeClass(TestDFSIO.java:212)
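
This NoClassDefFoundError, and the ones in the TestSlive reports below (org/apache/hadoop/fs/DelegationTokenRenewer$Renewable, org/apache/hadoop/fs/FSDataInputStream), all point at core Hadoop classes that should already be on the test classpath, which usually indicates a classpath or build-artifact problem in the forked test JVM rather than a bug in the individual tests. A minimal, hypothetical sanity check is sketched below; the class names come from the stack traces, everything else is illustrative.

import java.util.Arrays;
import java.util.List;

public class ClasspathCheck {
    public static void main(String[] args) {
        // Class names taken from the NoClassDefFoundError messages in this report.
        List<String> required = Arrays.asList(
                "org.apache.hadoop.util.PlatformName",
                "org.apache.hadoop.fs.FSDataInputStream",
                "org.apache.hadoop.fs.DelegationTokenRenewer");
        for (String name : required) {
            try {
                Class.forName(name);
                System.out.println("OK      " + name);
            } catch (ClassNotFoundException | NoClassDefFoundError e) {
                System.out.println("MISSING " + name + " (" + e + ")");
            }
        }
    }
}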


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testCreateOp

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testCreateOp(TestSlive.java:290)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testMkdir

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testMkdir(TestSlive.java:503)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testSleep

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testSleep(TestSlive.java:447)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testList

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testList(TestSlive.java:460)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testRead

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testRead(TestSlive.java:434)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testSelection

Error Message:
org/apache/hadoop/fs/FSDataInputStream

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.fs.slive.OperationFactory.getOperation(OperationFactory.java:58)
	at org.apache.hadoop.fs.slive.WeightSelector.configureOperations(WeightSelector.java:134)
	at org.apache.hadoop.fs.slive.WeightSelector.<init>(WeightSelector.java:66)
	at org.apache.hadoop.fs.slive.TestSlive.testSelection(TestSlive.java:167)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testOpFailures

Error Message:
org/apache/hadoop/fs/FSDataInputStream

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.fs.slive.TestSlive.testOpFailures(TestSlive.java:297)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testDelete

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testDelete(TestSlive.java:381)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testMRFlow

Error Message:
org/apache/hadoop/fs/FSDataInputStream

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.fs.slive.TestSlive.testMRFlow(TestSlive.java:414)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testRename

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testRename(TestSlive.java:402)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testAppendOp

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testAppendOp(TestSlive.java:532)


FAILED:  org.apache.hadoop.fs.slive.TestSlive.testTruncateOp

Error Message:
org/apache/hadoop/fs/DelegationTokenRenewer$Renewable

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/DelegationTokenRenewer$Renewable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:340)
	at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
	at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
	at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
	at org.apache.hadoop.fs.FileSystem.loadFileSystems(FileSystem.java:2757)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2776)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2797)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2836)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2818)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:180)
	at org.apache.hadoop.fs.slive.TestSlive.runOperationOk(TestSlive.java:346)
	at org.apache.hadoop.fs.slive.TestSlive.testTruncateOp(TestSlive.java:554)


FAILED:  org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled

Error Message:
Job didn't finish in 30 seconds

Stack Trace:
java.io.IOException: Job didn't finish in 30 seconds
	at org.apache.hadoop.mapred.UtilsForTests.runJobSucceed(UtilsForTests.java:622)
	at org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled(TestMRTimelineEventHandling.java:208)
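
UtilsForTests.runJobSucceed evidently enforces a 30-second completion bound and turns a timeout into this IOException. Below is a rough sketch of that kind of bounded wait using only the public Job API; the helper name and the polling interval are assumptions, not the actual UtilsForTests code.

import java.io.IOException;

import org.apache.hadoop.mapreduce.Job;

public class JobTimeoutSketch {
    // Submits the job and fails if it has not completed within timeoutMillis,
    // roughly the contract the error message above describes.
    public static void runWithTimeout(Job job, long timeoutMillis)
            throws IOException, InterruptedException, ClassNotFoundException {
        job.submit();
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!job.isComplete()) {
            if (System.currentTimeMillis() > deadline) {
                throw new IOException(
                        "Job didn't finish in " + (timeoutMillis / 1000) + " seconds");
            }
            Thread.sleep(1000);
        }
    }
}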


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1363 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1363/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9537 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.179 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.375 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.307 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.905 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.381 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:07 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-05-05T02:33:56+00:00
[INFO] Final Memory: 34M/161M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
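
TestMRJobs.verifySleepJobCounters asserts on job counters after an uber-mode sleep job; the failure is reported only as "null" because assertTrue was called without a message. The sketch below shows the general shape of such a counter check; which counter and which expected value the real test verifies at TestMRJobs.java:475 is not visible from this log, so those details are assumptions.

import org.apache.hadoop.mapreduce.Counters;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.JobCounter;

public class UberCounterSketch {
    // Checks that every map of an uber job was run as an uber sub-map
    // (an assumed expectation, for illustration only).
    public static void verifyUberSubMaps(Job job, long expectedMaps) throws Exception {
        Counters counters = job.getCounters();
        long uberSubMaps = counters.findCounter(JobCounter.NUM_UBER_SUBMAPS).getValue();
        if (uberSubMaps != expectedMaps) {
            throw new AssertionError(
                    "expected " + expectedMaps + " uber sub-maps but found " + uberSubMaps);
        }
    }
}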


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1362 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1362/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8774 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.674 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.595 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.864 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.897 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 345, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.940 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:25 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 36.599 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.804 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:11 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:27 min
[INFO] Finished at: 2016-05-04T23:20:30+00:00
[INFO] Final Memory: 36M/192M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk-Java8 - Build # 1361 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1361/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8785 lines...]
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.676 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.367 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.359 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests in error: 
  TestRecovery.testSpeculative:1201 NullPointer

Tests run: 345, Failures: 1, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.452 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 35.961 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.498 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:31 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:05 min
[INFO] Finished at: 2016-05-04T20:21:19+00:00
[INFO] Final Memory: 35M/199M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative(TestRecovery.java:1201)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)




Hadoop-Mapreduce-trunk-Java8 - Build # 1360 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1360/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9534 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.274 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.696 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.403 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.126 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.248 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-05-04T19:24:54+00:00
[INFO] Final Memory: 34M/138M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1359 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1359/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9341 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.063 sec - in org.apache.hadoop.mapred.TestInputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.jobcontrol.TestJobControl
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.6 sec - in org.apache.hadoop.mapred.jobcontrol.TestJobControl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.jobcontrol.TestLocalJobControl
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.739 sec - in org.apache.hadoop.mapred.jobcontrol.TestLocalJobControl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClusterMapReduceTestCase

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 333, Failures: 4, Errors: 0, Skipped: 3

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.406 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.592 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.261 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:06 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:06 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:25 h
[INFO] Finished at: 2016-05-04T15:44:08+00:00
[INFO] Final Memory: 38M/136M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter6444051482587195516.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3768191432135469352tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1412269030364310039630tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
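
In the console tail above, Maven's resume hint leaves <goals> as a placeholder. Assuming the aim is only to rerun the failing module's tests from the same multi-module root the original build used, an invocation along these lines would pick the reactor back up at the JobClient module (the choice of the test goal is an assumption, not something stated in the log):

    mvn test -rf :hadoop-mapreduce-client-jobclient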



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1358 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1358/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10414 lines...]
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.783 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests in error: 
  TestDFSIO.testReadRandom:251->randomReadTest:626->runIOTest:456 »  test timed ...

Tests run: 533, Failures: 7, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.317 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.941 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.349 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:12 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-05-04T10:08:49+00:00
[INFO] Final Memory: 34M/205M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
8 tests failed.
FAILED:  org.apache.hadoop.fs.TestDFSIO.testReadRandom

Error Message:
test timed out after 3000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 3000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.mapreduce.Job.monitorAndPrintJob(Job.java:1404)
	at org.apache.hadoop.mapred.JobClient$NetworkedJob.monitorAndPrintJob(JobClient.java:412)
	at org.apache.hadoop.mapred.JobClient.monitorAndPrintJob(JobClient.java:895)
	at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:875)
	at org.apache.hadoop.fs.TestDFSIO.runIOTest(TestDFSIO.java:456)
	at org.apache.hadoop.fs.TestDFSIO.randomReadTest(TestDFSIO.java:626)
	at org.apache.hadoop.fs.TestDFSIO.testReadRandom(TestDFSIO.java:251)
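
The wording "test timed out after 3000 milliseconds" is what JUnit 4 produces when a per-test timeout (for example @Test(timeout = 3000)) expires; the resulting java.lang.Exception carries the stack of the stuck thread, which is why the trace above ends inside Thread.sleep under Job.monitorAndPrintJob. A hypothetical sketch of the mechanism, not the actual TestDFSIO code:

    import org.junit.Test;

    public class TimeoutExample {
      // Fails with: java.lang.Exception: test timed out after 3000 milliseconds
      @Test(timeout = 3000)
      public void slowJob() throws InterruptedException {
        Thread.sleep(10_000);  // stands in for a MiniMR job that never finishes in time
      }
    }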


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1357 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1357/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9575 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.542 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.709 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests in error: 
  TestMiniMRClasspath.testClassPath:185 » NoClassDefFound org/apache/hadoop/util...
  TestMiniMRClasspath.testExternalWritable:204 » FileNotFound webapps/hdfs not f...

Tests run: 520, Failures: 2, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.208 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.148 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.476 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-05-04T07:23:52+00:00
[INFO] Final Memory: 34M/145M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter5620650508609085616.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire8355437505714695696tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2597621990470659908204tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
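
The repeated "ignoring option MaxPermSize=768m" warnings in the console tails are benign on JDK 8: the permanent generation was removed in Java 8, so the -XX:MaxPermSize flag passed to the forked test JVMs is ignored. If the intent were to keep a cap on class-metadata memory under Java 8, the corresponding flag is -XX:MaxMetaspaceSize; an illustrative argLine (not a change actually made in this job's configuration) would be:

    -Xmx2048m -XX:MaxMetaspaceSize=768m -XX:+HeapDumpOnOutOfMemoryError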



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRClasspath.testClassPath

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:681)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:172)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.TestMiniMRClasspath.testClassPath(TestMiniMRClasspath.java:185)


FAILED:  org.apache.hadoop.mapred.TestMiniMRClasspath.testExternalWritable

Error Message:
webapps/hdfs not found in CLASSPATH

Stack Trace:
java.io.FileNotFoundException: webapps/hdfs not found in CLASSPATH
	at org.apache.hadoop.http.HttpServer2.getWebAppsPath(HttpServer2.java:789)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:342)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:110)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:292)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:862)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:705)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:924)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:903)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1620)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1247)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1016)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:891)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:823)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapred.TestMiniMRClasspath.testExternalWritable(TestMiniMRClasspath.java:204)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)




Hadoop-Mapreduce-trunk-Java8 - Build # 1356 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1356/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9536 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.546 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.491 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.379 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.049 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.380 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-05-04T02:23:58+00:00
[INFO] Final Memory: 34M/181M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1355 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1355/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9535 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.624 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.166 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.246 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.150 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.313 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-05-03T22:29:11+00:00
[INFO] Final Memory: 34M/127M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1354 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1354/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8776 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.206 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.275 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.849 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.789 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.688 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:30 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 41.031 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 10.282 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:52 min
[INFO] Finished at: 2016-05-03T19:21:58+00:00
[INFO] Final Memory: 36M/254M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)
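
The "Task state not correct expected:<KILLED> but was:<NEW>" text is the standard JUnit 4 assertEquals(message, expected, actual) failure format: the supplied message followed by the expected and actual values in angle brackets. A hypothetical sketch that reproduces the same wording; the enum and surrounding code are illustrative, not the actual TestKill implementation:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class StateAssertExample {
      enum TaskState { NEW, KILLED }

      @Test
      public void taskShouldBeKilled() {
        TaskState actual = TaskState.NEW;  // stands in for the state read back from the killed task
        // Fails with: Task state not correct expected:<KILLED> but was:<NEW>
        assertEquals("Task state not correct", TaskState.KILLED, actual);
      }
    }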




Hadoop-Mapreduce-trunk-Java8 - Build # 1353 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1353/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9537 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.179 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.472 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.485 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.144 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.459 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:15 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-05-03T14:43:42+00:00
[INFO] Final Memory: 34M/132M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1352 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1352/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9532 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.253 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.477 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.581 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 529, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.234 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.939 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.331 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:53 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 h
[INFO] Finished at: 2016-05-03T09:19:30+00:00
[INFO] Final Memory: 34M/144M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1351 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1351/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9539 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.483 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.813 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.787 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 38.252 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.283 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:52 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:31 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:08 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:32 h
[INFO] Finished at: 2016-05-03T06:42:35+00:00
[INFO] Final Memory: 34M/160M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1350 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1350/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9539 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.491 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 179.269 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.871 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:06 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.714 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.547 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:44 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:48 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:12 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:35 h
[INFO] Finished at: 2016-05-03T03:25:04+00:00
[INFO] Final Memory: 34M/155M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1349 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1349/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8775 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.361 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.265 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.055 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.722 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.291 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.573 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.253 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:14 min
[INFO] Finished at: 2016-05-02T19:55:02+00:00
[INFO] Final Memory: 36M/261M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk-Java8 - Build # 1348 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1348/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9546 lines...]
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.489 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.441 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.317 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.169 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.325 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:01 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-04-30T22:26:36+00:00
[INFO] Final Memory: 34M/148M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1347 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1347/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9545 lines...]
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.576 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.906 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.381 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.833 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.426 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:16 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:13 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-30T08:22:35+00:00
[INFO] Final Memory: 34M/195M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1346 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1346/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9602 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.366 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMiniMRChildTask.testTaskEnv:472->runTestTaskEnv:550 The environment checker job failed.
  TestMiniMRChildTask.testTaskOldEnv:496->runTestTaskEnv:550 The environment checker job failed.

Tests in error: 
  TestMiniMRProxyUser.__testCurrentUser:132->mrRun:124 » NoClassDefFound org/apa...
  TestMiniMRProxyUser.setUp:88 » YarnRuntime could not cleanup test dir: org.apa...

Tests run: 533, Failures: 7, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.227 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.066 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.247 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-29T23:24:21+00:00
[INFO] Final Memory: 34M/195M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
9 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskEnv(TestMiniMRChildTask.java:472)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv

Error Message:
The environment checker job failed.

Stack Trace:
java.lang.AssertionError: The environment checker job failed.
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.runTestTaskEnv(TestMiniMRChildTask.java:550)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.testTaskOldEnv(TestMiniMRChildTask.java:496)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMiniMRProxyUser.__testCurrentUser

Error Message:
org/apache/hadoop/io/retry/AtMostOnce

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/io/retry/AtMostOnce
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:116)
	at com.sun.proxy.$Proxy25.delete(Unknown Source)
	at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:1572)
	at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:725)
	at org.apache.hadoop.hdfs.DistributedFileSystem$16.doCall(DistributedFileSystem.java:722)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:732)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:255)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1753)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338)
	at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:576)
	at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:571)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1753)
	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:571)
	at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:562)
	at org.apache.hadoop.mapreduce.v2.TestMiniMRProxyUser.mrRun(TestMiniMRProxyUser.java:124)
	at org.apache.hadoop.mapreduce.v2.TestMiniMRProxyUser.__testCurrentUser(TestMiniMRProxyUser.java:132)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMiniMRProxyUser.testValidProxyUser

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1753)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:161)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapreduce.v2.TestMiniMRProxyUser.setUp(TestMiniMRProxyUser.java:88)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1345 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1345/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8772 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.151 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.044 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.264 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.668 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.150 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.091 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.281 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:20 min
[INFO] Finished at: 2016-04-29T20:02:59+00:00
[INFO] Final Memory: 36M/179M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
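The "expected:<SUCCEEDED> but was:<ERROR>" wording above is JUnit's standard assertEquals failure format, which reports both values even when no custom message is given. A hypothetical sketch (the enum and observed value are invented for illustration, not taken from TestJobImpl):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class StateAssertSketch {

      // Hypothetical states standing in for the MapReduce job states under test.
      enum State { SUCCEEDED, ERROR }

      @Test
      public void reportsExpectedAndActual() {
        State actual = State.ERROR; // invented observed value
        // On failure this is reported as
        // "java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>".
        assertEquals(State.SUCCEEDED, actual);
      }
    }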




Hadoop-Mapreduce-trunk-Java8 - Build # 1344 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1344/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9682 lines...]

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests in error: 
  TestMRCJCFileInputFormat.testLocality:58->newDFSCluster:52 » NoClassDefFound o...
  TestMRCJCFileInputFormat.testMultiLevelInput:169 » NoClassDefFound org/apache/...
  TestMRCJCFileInputFormat.testNumInputs:118->newDFSCluster:52 » NoClassDefFound
  TestMiniMRWithDFSWithDistinctUsers.testMultipleSpills:147 » Remote org/apache/...
  TestMiniMRWithDFSWithDistinctUsers.tearDown:104 » NoClassDefFound org/apache/h...
  TestMiniMRChildTask.setup:356 » NoClassDefFound org/apache/hadoop/yarn/event/E...

Tests run: 527, Failures: 5, Errors: 4, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.224 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.428 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.275 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:37 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:55 h
[INFO] Finished at: 2016-04-29T19:15:38+00:00
[INFO] Final Memory: 34M/185M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7395651280021505883.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3797684693181782309tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2971151019236964556290tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
9 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileInputFormat.testLocality

Error Message:
org/apache/hadoop/security/authentication/server/AuthenticationFilter

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/server/AuthenticationFilter
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:455)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:348)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:110)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:292)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:862)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:705)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:924)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:903)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1620)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1247)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1016)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:891)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:823)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapred.TestMRCJCFileInputFormat.newDFSCluster(TestMRCJCFileInputFormat.java:52)
	at org.apache.hadoop.mapred.TestMRCJCFileInputFormat.testLocality(TestMRCJCFileInputFormat.java:58)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileInputFormat.testMultiLevelInput

Error Message:
org/apache/hadoop/security/authentication/server/AuthenticationFilter

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/server/AuthenticationFilter
	at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:455)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:348)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:110)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:292)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:862)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:705)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:924)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:903)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1620)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1247)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1016)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:891)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:823)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapred.TestMRCJCFileInputFormat.testMultiLevelInput(TestMRCJCFileInputFormat.java:169)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileInputFormat.testNumInputs

Error Message:
org/apache/hadoop/security/authentication/server/AuthenticationFilter

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/security/authentication/server/AuthenticationFilter
	at org.apache.hadoop.http.HttpServer2.constructSecretProvider(HttpServer2.java:455)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:348)
	at org.apache.hadoop.http.HttpServer2.<init>(HttpServer2.java:110)
	at org.apache.hadoop.http.HttpServer2$Builder.build(HttpServer2.java:292)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:862)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:705)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:924)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:903)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1620)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:1247)
	at org.apache.hadoop.hdfs.MiniDFSCluster.configureNameService(MiniDFSCluster.java:1016)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:891)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:823)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapred.TestMRCJCFileInputFormat.newDFSCluster(TestMRCJCFileInputFormat.java:52)
	at org.apache.hadoop.mapred.TestMRCJCFileInputFormat.testNumInputs(TestMRCJCFileInputFormat.java:118)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.org.apache.hadoop.mapred.TestMiniMRChildTask

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.setup(TestMiniMRChildTask.java:356)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
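
The repeated "Error Message: null" entries above are what JUnit 4's single-argument Assert.assertTrue produces: it carries no failure message, so the AssertionError that surfaces in the report has a null message. A minimal Java sketch of that pattern, assuming JUnit 4 on the classpath; the class and method names below are illustrative only and are not the actual Hadoop test code:

    import org.junit.Assert;
    import org.junit.Test;

    public class NullAssertionMessageSketch {

        // Stand-in for the counter check that fails inside verifySleepJobCounters.
        private boolean counterLooksSane() {
            return false;
        }

        @Test
        public void messageLessAssertion() {
            // Fails with "java.lang.AssertionError: null", matching the traces above.
            Assert.assertTrue(counterLooksSane());
        }

        @Test
        public void messageBearingAssertion() {
            // The two-argument overload would make the report self-describing instead.
            Assert.assertTrue("sleep job counters out of expected range", counterLooksSane());
        }
    }

If the counter checks behind these failures used the message-bearing overload, the reports above would name the check that failed instead of printing "null".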




Hadoop-Mapreduce-trunk-Java8 - Build # 1343 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1343/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8772 lines...]
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.635 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.019 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.682 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.322 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.327 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.523 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.382 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:20 min
[INFO] Finished at: 2016-04-29T15:32:09+00:00
[INFO] Final Memory: 36M/198M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)




Hadoop-Mapreduce-trunk-Java8 - Build # 1342 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1342/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8888 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.133 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.903 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.179 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.795 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Tests in error: 
  TestContainerLauncher.testSlowNM:292 »  test timed out after 15000 millisecond...

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.461 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:03 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.243 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.497 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:57 min
[INFO] Finished at: 2016-04-29T12:38:32+00:00
[INFO] Final Memory: 35M/202M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher.testSlowNM

Error Message:
test timed out after 15000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 15000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:413)
	at org.apache.hadoop.mapreduce.v2.app.launcher.TestContainerLauncher.testSlowNM(TestContainerLauncher.java:292)
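
The "test timed out after 15000 milliseconds" wording above is how JUnit 4 reports a test that exceeds its timeout (the @Test(timeout = ...) annotation or an equivalent Timeout rule): the test body, here a waitForState poll that never observes the expected state, is cut off once the limit elapses. A minimal sketch, assuming JUnit 4; the class and method names are illustrative, not the real TestContainerLauncher code:

    import org.junit.Test;

    public class JUnitTimeoutSketch {

        @Test(timeout = 15000)
        public void waitsPastTheLimit() throws InterruptedException {
            // Stand-in for a wait loop that never sees the expected state;
            // JUnit aborts it and reports "test timed out after 15000 milliseconds".
            Thread.sleep(20_000);
        }
    }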




Hadoop-Mapreduce-trunk-Java8 - Build # 1341 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1341/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8429 lines...]
Running org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.239 sec - in org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestShufflePlugin
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.923 sec - in org.apache.hadoop.mapreduce.TestShufflePlugin
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestContextFactory
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.896 sec - in org.apache.hadoop.mapreduce.TestContextFactory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.681 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.366 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-04-29T11:31:20+00:00
[INFO] Final Memory: 32M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)




Hadoop-Mapreduce-trunk-Java8 - Build # 1340 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1340/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9519 lines...]
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.054 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.771 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.22 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.419 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.559 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.398 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-29T08:04:08+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)




Hadoop-Mapreduce-trunk-Java8 - Build # 1339 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1339/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9526 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.57 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.554 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.574 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.381 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.119 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.531 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:12 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-28T21:03:54+00:00
[INFO] Final Memory: 34M/161M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1338 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1338/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9356 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.jobcontrol.TestJobControl
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.328 sec - in org.apache.hadoop.mapred.jobcontrol.TestJobControl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.jobcontrol.TestLocalJobControl
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.628 sec - in org.apache.hadoop.mapred.jobcontrol.TestLocalJobControl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClusterMapReduceTestCase

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests in error: 
  TestFixedLengthInputFormat.onlyOnce:68 » NoClassDefFound org/apache/hadoop/uti...

Tests run: 327, Failures: 4, Errors: 1, Skipped: 3

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.323 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.266 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.376 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:07 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:25 h
[INFO] Finished at: 2016-04-28T18:45:41+00:00
[INFO] Final Memory: 39M/149M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: org.apache.maven.surefire.booter.SurefireBooterForkException: Error occurred in starting fork, check output in log -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapreduce.lib.input.TestFixedLengthInputFormat.org.apache.hadoop.mapreduce.lib.input.TestFixedLengthInputFormat

Error Message:
org/apache/hadoop/util/PlatformName

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.security.UserGroupInformation.getOSLoginModuleName(UserGroupInformation.java:393)
	at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:438)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2964)
	at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2954)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2817)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:381)
	at org.apache.hadoop.fs.FileSystem.getLocal(FileSystem.java:352)
	at org.apache.hadoop.mapreduce.lib.input.TestFixedLengthInputFormat.onlyOnce(TestFixedLengthInputFormat.java:68)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1337 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1337/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9572 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 70.763 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.271 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.523 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRIntermediateDataEncryption.testSingleReducer:55->doEncryptionTest:75->doEncryptionTest:95->runMergeTest:161->verifyOutput:200 expected:<3000> but was:<0>

Tests run: 528, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.401 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.626 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.413 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-28T01:52:47+00:00
[INFO] Final Memory: 34M/138M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7424463850586390641.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire1925641599219304149tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_280640558872126608594tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testSingleReducer

Error Message:
expected:<3000> but was:<0>

Stack Trace:
java.lang.AssertionError: expected:<3000> but was:<0>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.verifyOutput(TestMRIntermediateDataEncryption.java:200)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.runMergeTest(TestMRIntermediateDataEncryption.java:161)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:95)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:75)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testSingleReducer(TestMRIntermediateDataEncryption.java:55)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1336 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1336/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4035 lines...]
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.771 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.377 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.596 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>
  TestTaskAttempt.testMillisCountersUpdate:266->verifyMillisCounters:309 Job state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Tests in error: 
  TestTaskAttempt.testMRAppHistoryForMap:120 » YarnRuntime could not cleanup tes...

Tests run: 344, Failures: 2, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.346 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:45 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.851 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.654 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:49 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:10 min
[INFO] Finished at: 2016-04-27T22:45:54+00:00
[INFO] Final Memory: 57M/296M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
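
The "expected:<SUCCEEDED> but was:<ERROR>" wording is JUnit's standard assertEquals
formatting for a value mismatch. A hedged sketch of the shape of such a check, using a
hypothetical state enum rather than the actual JobState type consulted by assertJobState:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class StateAssertExample {

      // Hypothetical stand-in for the MapReduce job state enum.
      enum State { RUNNING, SUCCEEDED, ERROR }

      @Test
      public void jobReachesSucceeded() {
        State actual = State.ERROR; // stand-in for the state observed by the test

        // On mismatch JUnit reports: java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
        assertEquals(State.SUCCEEDED, actual);
      }
    }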


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMillisCountersUpdate

Error Message:
Job state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>

Stack Trace:
java.lang.AssertionError: Job state is not correct (timedout) expected:<SUCCEEDED> but was:<RUNNING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:416)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.verifyMillisCounters(TestTaskAttempt.java:309)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMillisCountersUpdate(TestTaskAttempt.java:266)
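
The "(timedout)" marker in the message above is the usual signature of a wait helper that
polls for a target state, gives up after a deadline, and only then asserts on whatever state
it last observed. A rough, illustrative sketch of that idiom (an approximation, not the
actual MRApp.waitForState implementation; the poll interval and deadline are arbitrary):

    import static org.junit.Assert.assertEquals;

    public class WaitForStateSketch {

      enum State { RUNNING, SUCCEEDED }

      interface Job {
        State getState();
      }

      // Poll until the job reaches the expected state or the deadline passes, then assert.
      static void waitForState(Job job, State expected) throws InterruptedException {
        int polls = 0;
        while (job.getState() != expected && polls < 40) { // roughly 20s at 500ms per poll
          Thread.sleep(500);
          polls++;
        }
        // If the deadline expires first, the assertion fires with the "(timedout)" tag and the
        // last observed state, e.g. expected:<SUCCEEDED> but was:<RUNNING>.
        assertEquals("Job state is not correct (timedout)", expected, job.getState());
      }
    }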


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForMap

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1753)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt$FailingAttemptsMRApp.<init>(TestTaskAttempt.java:409)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForMap(TestTaskAttempt.java:120)
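
For context on "could not cleanup test dir": the MRApp test harness wipes its local working
directory while it is being constructed, and a failure there is rethrown unchecked before any
test logic runs, which is why the whole test errors out immediately. A rough sketch of that
kind of setup step (illustrative only; the helper name is invented and a plain
RuntimeException stands in for the YarnRuntimeException used by MRApp):

    import java.io.IOException;

    import org.apache.hadoop.fs.FileContext;
    import org.apache.hadoop.fs.Path;

    public class TestDirCleanupSketch {

      // Recursively delete the previous run's local test directory; surface any I/O
      // problem as an unchecked exception so construction of the test harness fails fast.
      static void cleanupTestDir(String testDir) {
        try {
          FileContext localFc = FileContext.getLocalFSFileContext();
          localFc.delete(new Path(testDir), true);
        } catch (IOException e) {
          throw new RuntimeException("could not cleanup test dir", e);
        }
      }
    }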



Hadoop-Mapreduce-trunk-Java8 - Build # 1335 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1335/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9515 lines...]
Running org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.009 sec - in org.apache.hadoop.mapreduce.lib.chain.TestChainErrors
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.135 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.175 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.502 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 529, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.401 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.039 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.349 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:52 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:11 h
[INFO] Finished at: 2016-04-27T21:16:45+00:00
[INFO] Final Memory: 34M/147M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1334 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1334/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9527 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.554 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.26 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.682 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.475 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.085 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.727 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:15 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-27T18:54:05+00:00
[INFO] Final Memory: 34M/152M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
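
The testAbort failure asserts that, after a job is aborted, the committer has removed
everything it wrote, so the output directory should contain zero entries; leftover temporary
or attempt files produce exactly the "expected:<0> but was:<4>" report seen here. A hedged
sketch of that kind of check (illustrative only, not the actual code at
TestMRCJCFileOutputCommitter.java:153):

    import static org.junit.Assert.assertEquals;

    import java.io.File;

    public class OutputDirEmptyCheckSketch {

      // After an abort, the output directory is expected to be empty.
      static void assertOutputDirEmpty(File outputDir) {
        File[] leftovers = outputDir.listFiles();
        int count = leftovers == null ? 0 : leftovers.length;

        // On failure JUnit reports: Output directory not empty expected:<0> but was:<N>
        assertEquals("Output directory not empty", 0, count);
      }
    }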


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1333 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1333/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9584 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.144 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.235 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.577 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 530, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.293 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.661 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.428 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:59 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:19 h
[INFO] Finished at: 2016-04-27T16:39:34+00:00
[INFO] Final Memory: 34M/151M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter4058041901220801928.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire7864393032238411291tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2677803789057696711712tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1332 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1332/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8816 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.388 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.712 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.904 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.809 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Tests in error: 
  TestMRAppComponentDependencies.testComponentStopOrder:52 »  test timed out aft...

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  9.460 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [04:09 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 34.346 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.960 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [16:18 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 21:24 min
[INFO] Finished at: 2016-04-27T11:44:35+00:00
[INFO] Final Memory: 36M/189M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRAppComponentDependencies.testComponentStopOrder

Error Message:
test timed out after 20000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 20000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.mapreduce.v2.app.TestMRAppComponentDependencies.testComponentStopOrder(TestMRAppComponentDependencies.java:52)
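
The "test timed out after 20000 milliseconds" message is how JUnit 4 reports a test that
exceeds its declared timeout: the test body runs on a separate thread, and when the limit
passes the framework fails the test and reports the stack trace captured from the still
running thread, hence the Thread.sleep frame above. A minimal example of a test declared
with that limit (deliberately overrunning it to show the failure mode):

    import org.junit.Test;

    public class TimeoutExample {

      // Fails with "test timed out after 20000 milliseconds" because the sleep
      // outlasts the declared limit.
      @Test(timeout = 20000)
      public void finishesWithinTwentySeconds() throws Exception {
        Thread.sleep(25000);
      }
    }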



Hadoop-Mapreduce-trunk-Java8 - Build # 1331 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1331/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 15124 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.677 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.636 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.866 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 520, Failures: 1, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.439 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.187 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.311 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:07 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2016-04-27T06:44:19+00:00
[INFO] Final Memory: 34M/158M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter4577813972460915431.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire1214866582285806452tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2037775318790145158044tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)



Hadoop-Mapreduce-trunk-Java8 - Build # 1330 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1330/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9574 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.713 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.566 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.322 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.208 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.730 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.461 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:00 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-27T04:32:55+00:00
[INFO] Final Memory: 34M/160M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1329 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1329/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9573 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.597 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.578 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.402 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.194 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.496 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.248 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:06 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-27T02:23:49+00:00
[INFO] Final Memory: 34M/156M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1328 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1328/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.89 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.599 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.101 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.991 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestRecovery.testCrashed:188 TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.777 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:29 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 37.706 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.942 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [13:43 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:02 min
[INFO] Finished at: 2016-04-26T21:14:59+00:00
[INFO] Final Memory: 35M/202M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Stack Trace:
java.lang.AssertionError: TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:382)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:188)



Hadoop-Mapreduce-trunk-Java8 - Build # 1327 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1327/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9573 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.52 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.534 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.116 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.224 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.087 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.368 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:10 h
[INFO] Finished at: 2016-04-26T19:29:50+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1326 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1326/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9623 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.903 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.602 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests in error: 
  TestJobSysDirWithDFS.testWithDFS:137 » NoClassDefFound org/apache/hadoop/util/...

Tests run: 532, Failures: 5, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.277 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.012 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.231 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:19 h
[INFO] Finished at: 2016-04-26T17:07:23+00:00
[INFO] Final Memory: 37M/152M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: org.apache.maven.surefire.booter.SurefireBooterForkException: Error occurred in starting fork, check output in log -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestJobSysDirWithDFS.testWithDFS

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:681)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:172)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.TestJobSysDirWithDFS.testWithDFS(TestJobSysDirWithDFS.java:137)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1325 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1325/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 26532 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.452 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests in error: 
  TestMRJobs.testSleepJobWithRemoteJar:198->testSleepJobInternal:234 »  test tim...
  TestMRJobs.testConfVerificationWithJobClient:326->testConfVerification:411 »  ...
  TestMRJobs.testFailingMapper:568->runFailingMapperJob:630 »  test timed out af...
  TestMRJobs.testConfVerificationWithClassloaderCustomClasses:316->testConfVerification:365 » NoClassDefFound
  TestClusterMRNotification>NotificationTestCase.setUp:152->HadoopTestCase.setUp:156 » YarnRuntime
  TestJobCounters.testHeapUsageCounter:670 » NoClassDefFound org/apache/hadoop/y...

Tests run: 513, Failures: 4, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.254 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.317 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.745 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:19 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:02 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:21 h
[INFO] Finished at: 2016-04-26T14:41:52+00:00
[INFO] Final Memory: 39M/156M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter170615521598053974.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3100316962287388375tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2428282644596297093120tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestClusterMRNotification.testMR

Error Message:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to intialize existing directories

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to intialize existing directories
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:250)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapred.NotificationTestCase.setUp(NotificationTestCase.java:152)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to intialize existing directories
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:100)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapred.NotificationTestCase.setUp(NotificationTestCase.java:152)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.io.FileNotFoundException: File file:/tmp/hadoop-yarn/staging/history/done does not exist
	at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:469)
	at org.apache.hadoop.fs.DelegateToFileSystem.listStatus(DelegateToFileSystem.java:168)
	at org.apache.hadoop.fs.ChecksumFs.listStatus(ChecksumFs.java:519)
	at org.apache.hadoop.fs.AbstractFileSystem$1.<init>(AbstractFileSystem.java:890)
	at org.apache.hadoop.fs.AbstractFileSystem.listStatusIterator(AbstractFileSystem.java:888)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1494)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:824)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:718)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.HadoopTestCase.setUp(HadoopTestCase.java:156)
	at org.apache.hadoop.mapred.NotificationTestCase.setUp(NotificationTestCase.java:152)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestJobCounters.testHeapUsageCounter

Error Message:
org/apache/hadoop/yarn/client/api/YarnClient

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/client/api/YarnClient
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:122)
	at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:111)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:98)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:91)
	at org.apache.hadoop.mapred.JobClient.init(JobClient.java:475)
	at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:454)
	at org.apache.hadoop.mapred.TestJobCounters.testHeapUsageCounter(TestJobCounters.java:670)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1324 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1324/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9573 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.687 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.543 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.072 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.300 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.138 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.257 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-26T09:42:53+00:00
[INFO] Final Memory: 34M/155M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1323 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1323/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9574 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.623 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.562 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.246 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.278 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.262 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.355 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:10 h
[INFO] Finished at: 2016-04-26T07:30:01+00:00
[INFO] Final Memory: 34M/156M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1322 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1322/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9397 lines...]
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.testInvalidUri2:227 NoClassDefFound org/apache/hadoop/yarn/w...
  TestAMWebServices.testInvalidAccept:244 NoClassDefFound org/apache/hadoop/yarn...

Tests run: 344, Failures: 11, Errors: 22, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.359 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.656 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.380 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:10 min
[INFO] Finished at: 2016-04-25T22:33:20+00:00
[INFO] Final Memory: 36M/182M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
33 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri2

Error Message:
org/apache/hadoop/yarn/webapp/WebServicesTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/webapp/WebServicesTestUtils
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri2(TestAMWebServices.java:227)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMXML

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMXML(TestAMWebServices.java:149)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfo

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfo(TestAMWebServices.java:160)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoDefault

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoDefault(TestAMWebServices.java:183)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAM

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAM(TestAMWebServices.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidAccept

Error Message:
org/apache/hadoop/yarn/webapp/WebServicesTestUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/webapp/WebServicesTestUtils
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidAccept(TestAMWebServices.java:244)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodesXML

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodesXML(TestAMWebServices.java:267)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodes

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodes(TestAMWebServices.java:255)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMDefault

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMDefault(TestAMWebServices.java:138)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoXML

Error Message:
expected:<application/xml> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/xml> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoXML(TestAMWebServices.java:195)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri

Error Message:
expected:<Not Found> but was:<Internal Server Error>

Stack Trace:
java.lang.AssertionError: expected:<Not Found> but was:<Internal Server Error>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri(TestAMWebServices.java:210)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoSlash

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoSlash(TestAMWebServices.java:172)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMSlash

Error Message:
expected:<application/json> but was:<text/html; charset=ISO-8859-1>

Stack Trace:
java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMSlash(TestAMWebServices.java:127)
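
The expected:<application/json> and expected:<application/xml> failures above are JUnit assertEquals checks on the Content-Type of the web-service response: instead of the JSON or XML media type the tests expect, the server answered with an HTML error page (text/html; charset=ISO-8859-1), which is consistent with the web app failing to initialize in the same runs that hit the NoClassDefFoundError. A minimal, self-contained illustration of that kind of assertion follows; the class name ContentTypeAssertionSketch and the hard-coded strings are illustrative only, not the actual TestAMWebServices code.

    import static org.junit.Assert.assertEquals;

    // Illustrative sketch: comparing an expected media type with the Content-Type actually returned.
    public class ContentTypeAssertionSketch {
        public static void main(String[] args) {
            String expected = "application/json";
            String actual = "text/html; charset=ISO-8859-1";  // what a failing run would observe
            // Fails with: java.lang.AssertionError: expected:<application/json> but was:<text/html; charset=ISO-8859-1>
            assertEquals(expected, actual);
        }
    }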


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCounters

Error Message:
org/apache/hadoop/yarn/api/records/ContainerId

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/api/records/ContainerId
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newTaskAttemptReport(MockJobs.java:187)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newTaskAttempt(MockJobs.java:248)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newTaskAttempts(MockJobs.java:238)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newTask(MockJobs.java:370)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newTasks(MockJobs.java:358)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newJob(MockJobs.java:490)
	at org.apache.hadoop.mapreduce.v2.app.MockJobs.newJobs(MockJobs.java:142)
	at org.apache.hadoop.mapreduce.v2.app.MockAppContext.<init>(MockAppContext.java:67)
	at org.apache.hadoop.mapreduce.v2.app.MockAppContext.<init>(MockAppContext.java:60)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks$1.configureServlets(TestAMWebServicesTasks.java:84)
	at com.google.inject.servlet.ServletModule.configure(ServletModule.java:53)
	at com.google.inject.AbstractModule.configure(AbstractModule.java:59)
	at com.google.inject.spi.Elements$RecordingBinder.install(Elements.java:223)
	at com.google.inject.spi.Elements.getElements(Elements.java:101)
	at com.google.inject.internal.InjectorShell$Builder.build(InjectorShell.java:133)
	at com.google.inject.internal.InternalInjectorCreator.build(InternalInjectorCreator.java:103)
	at com.google.inject.Guice.createInjector(Guice.java:95)
	at com.google.inject.Guice.createInjector(Guice.java:72)
	at com.google.inject.Guice.createInjector(Guice.java:62)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:80)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)
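
The remaining TestAMWebServicesTasks failures below all carry the same TestContainerException: the Grizzly container backing the Jersey test framework tries to bind to a port that is still occupied, most likely by a server left over from an earlier test in the same JVM. A common way to sidestep that kind of collision, assuming the test harness allows the port to be chosen per run (a sketch, not the project's actual fix), is to ask the operating system for a free ephemeral port first:

    import java.io.IOException;
    import java.net.ServerSocket;

    // Sketch: let the OS hand out a currently free ephemeral port before starting a test HTTP server.
    public class FreePortSketch {
        static int findFreePort() throws IOException {
            try (ServerSocket socket = new ServerSocket(0)) {  // port 0 asks the OS to pick a free port
                return socket.getLocalPort();
            }
        }

        public static void main(String[] args) throws IOException {
            System.out.println("Free port: " + findFreePort());
        }
    }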


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testJobTaskCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryReduce

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasks

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryMap

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)



Hadoop-Mapreduce-trunk-Java8 - Build # 1321 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1321/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8475 lines...]
Running org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.245 sec - in org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestShufflePlugin
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.894 sec - in org.apache.hadoop.mapreduce.TestShufflePlugin
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestContextFactory
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.835 sec - in org.apache.hadoop.mapreduce.TestContextFactory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.71 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.389 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:41 min
[INFO] Finished at: 2016-04-25T20:01:18+00:00
[INFO] Final Memory: 32M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
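
The error message "null" here only tells us that the test tripped an assertTrue(...) call made without a failure message, so JUnit wraps the failure in an AssertionError whose message is null; the report does not say which condition in testGetJob was false. A tiny, hypothetical illustration of the pattern (NullMessageAssertionSketch is not part of the Hadoop code base):

    import static org.junit.Assert.assertTrue;

    // Sketch: an assertTrue without a message surfaces as "java.lang.AssertionError: null" in surefire reports.
    public class NullMessageAssertionSketch {
        public static void main(String[] args) {
            assertTrue(false);                                   // reported as AssertionError: null
            // assertTrue("job report should be found", false);  // a message would make the report self-explanatory
        }
    }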



Hadoop-Mapreduce-trunk-Java8 - Build # 1320 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1320/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.125 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.746 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.637 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.471 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:43 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.767 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.864 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:33 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:22 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-25T19:27:23+00:00
[INFO] Final Memory: 34M/144M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1319 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1319/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.06 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.286 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.089 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.495 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.322 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:41 min
[INFO] Finished at: 2016-04-25T14:21:15+00:00
[INFO] Final Memory: 32M/178M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
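
(The "null" error message above is simply what JUnit reports when an assertTrue(condition) call without a message string fails: Assert.fail is invoked with a null message, as the Assert.java:86/:41/:52 frames show.) A minimal sketch for reproducing just this recurring failure locally — assuming a trunk checkout with the SNAPSHOT dependencies already installed via mvn install from the project root, and the module path shown in the console output; surefire 2.17 accepts -Dtest=Class#method:

  # run only the failing test method in the client-core module
  cd hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core
  mvn test -Dtest=TestCLI#testGetJob
  # per-test output lands under target/surefire-reports, as the [ERROR] lines note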



Hadoop-Mapreduce-trunk-Java8 - Build # 1318 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1318/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8475 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.198 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.251 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.116 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.554 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.369 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:41 min
[INFO] Finished at: 2016-04-25T06:21:50+00:00
[INFO] Final Memory: 32M/179M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1317 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1317/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9573 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.606 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.533 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.808 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.269 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.212 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.282 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-23T17:27:50+00:00
[INFO] Final Memory: 34M/159M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1316 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1316/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9574 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.529 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.53 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.255 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.161 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.806 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.281 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-23T09:29:07+00:00
[INFO] Final Memory: 34M/139M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1315 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1315/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.116 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.034 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.159 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.167 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.702 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.283 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:45 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-23T00:37:08+00:00
[INFO] Final Memory: 34M/155M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1314 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1314/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9582 lines...]
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.163 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.485 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests in error: 
  TestDelegatingInputFormat.testSplitting:91 » NoClassDefFound org/apache/hadoop...

Tests run: 533, Failures: 4, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.581 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.756 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.439 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-22T21:50:01+00:00
[INFO] Final Memory: 34M/149M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapreduce.lib.input.TestDelegatingInputFormat.testSplitting

Error Message:
org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
	at org.apache.hadoop.util.IntrusiveCollection.iterator(IntrusiveCollection.java:213)
	at org.apache.hadoop.util.IntrusiveCollection.clear(IntrusiveCollection.java:368)
	at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.clearPendingCachingCommands(DatanodeManager.java:1721)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1243)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1588)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:814)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:993)
	at org.apache.hadoop.hdfs.MiniDFSCluster.stopAndJoinNameNode(MiniDFSCluster.java:1965)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1911)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.mapreduce.lib.input.TestDelegatingInputFormat.testSplitting(TestDelegatingInputFormat.java:91)
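
A NoClassDefFoundError for an inner class (IntrusiveCollection$IntrusiveIterator) that exists in the source tree often indicates a stale or mismatched hadoop-common jar on the test classpath (for example, another build overwriting the local SNAPSHOT mid-run) rather than a source bug. A minimal sketch, assuming a trunk checkout, for checking which hadoop-common build the jobclient module actually resolves:

  cd hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient
  # show where hadoop-common comes from on this module's classpath
  mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-common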


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1313 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1313/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8472 lines...]
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.892 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.495 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.076 sec - in org.apache.hadoop.mapred.TestClock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.190 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:38 min
[INFO] Finished at: 2016-04-22T18:21:16+00:00
[INFO] Final Memory: 32M/184M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1312 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1312/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.877 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.275 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.443 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.272 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-04-22T16:21:10+00:00
[INFO] Final Memory: 32M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1311 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1311/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8855 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.96 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.282 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.987 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.684 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.265 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.708 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.332 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:56 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:09 min
[INFO] Finished at: 2016-04-22T11:32:58+00:00
[INFO] Final Memory: 36M/245M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)
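
The IllegalStateException comes from protobuf's CodedInputStream, which rejects any InputStream whose read(byte[]) returns 0 for a non-empty buffer (the InputStream contract requires blocking for at least one byte, returning -1 at end of stream, or throwing). A minimal sketch, assuming a trunk checkout with SNAPSHOT dependencies installed, for re-running just this test in the app module with full stack traces as the [ERROR] hints suggest:

  cd hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app
  mvn -e test -Dtest=TestRMContainerAllocator#testRMContainerAllocatorResendsRequestsOnRMRestart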



Hadoop-Mapreduce-trunk-Java8 - Build # 1310 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1310/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8858 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.993 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.008 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.305 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.73 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.370 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.564 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.465 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:14 min
[INFO] Finished at: 2016-04-22T05:31:17+00:00
[INFO] Final Memory: 36M/184M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk-Java8 - Build # 1309 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1309/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8857 lines...]
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.619 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.366 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.365 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.333 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.440 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.859 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:19 min
[INFO] Finished at: 2016-04-22T02:29:10+00:00
[INFO] Final Memory: 35M/204M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk-Java8 - Build # 1308 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1308/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8855 lines...]
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.632 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.035 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.708 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.291 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.538 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.950 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.425 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:11 min
[INFO] Finished at: 2016-04-21T23:53:55+00:00
[INFO] Final Memory: 36M/207M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk-Java8 - Build # 1307 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1307/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8858 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.574 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.667 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.83 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.056 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.752 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:07 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 45.862 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.768 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:14 min
[INFO] Finished at: 2016-04-21T22:24:21+00:00
[INFO] Final Memory: 36M/187M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk-Java8 - Build # 1306 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1306/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8867 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.01 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.367 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.872 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 1, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.360 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:12 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 32.668 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.171 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:51 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:50 min
[INFO] Finished at: 2016-04-21T20:38:59+00:00
[INFO] Final Memory: 35M/205M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
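
The assertion above reports expected:<SUCCEEDED> but was:<ERROR>: the state observed by TestJobImpl.assertJobState after the unusable-node transition was ERROR rather than SUCCEEDED. A common shape for this kind of check is a bounded poll that re-reads the state until it matches or a deadline passes. The helper below is a hypothetical, self-contained sketch of that pattern (class name, timings, and message format are invented), not the actual Hadoop test code.

    import java.util.concurrent.TimeUnit;
    import java.util.function.Supplier;

    /** Hypothetical polling assertion, invented for illustration. */
    public class StateAssert {

      static <T> void assertEventually(T expected, Supplier<T> actual, long timeoutMs)
          throws InterruptedException {
        long deadline = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(timeoutMs);
        T last = actual.get();
        while (!expected.equals(last) && System.nanoTime() < deadline) {
          Thread.sleep(50);                       // brief back-off between polls
          last = actual.get();
        }
        if (!expected.equals(last)) {
          throw new AssertionError("expected:<" + expected + "> but was:<" + last + ">");
        }
      }

      public static void main(String[] args) throws InterruptedException {
        long start = System.currentTimeMillis();
        // Toy state source: reports RUNNING for 200 ms, then SUCCEEDED.
        assertEventually("SUCCEEDED",
            () -> System.currentTimeMillis() - start < 200 ? "RUNNING" : "SUCCEEDED",
            2_000);
        System.out.println("state reached SUCCEEDED within the timeout");
      }
    }

When the job lands in a terminal state such as ERROR, a poll like this still ends with the same expected/actual message, which is what the report above shows.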


FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk-Java8 - Build # 1305 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1305/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8856 lines...]
Running org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.63 sec - in org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.798 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.387 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.6 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Tests in error: 
  TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart:2535 » IllegalState

Tests run: 344, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.280 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.584 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.395 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:00 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:09 min
[INFO] Finished at: 2016-04-21T19:15:02+00:00
[INFO] Final Memory: 36M/200M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart

Error Message:
InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.

Stack Trace:
java.lang.IllegalStateException: InputStream#read(byte[]) returned invalid result: 0
The InputStream implementation is buggy.
	at com.google.protobuf.CodedInputStream.refillBuffer(CodedInputStream.java:739)
	at com.google.protobuf.CodedInputStream.isAtEnd(CodedInputStream.java:701)
	at com.google.protobuf.CodedInputStream.readTag(CodedInputStream.java:99)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1828)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.<init>(SecurityProtos.java:1792)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1892)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto$1.parsePartialFrom(SecurityProtos.java:1887)
	at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:200)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:217)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:223)
	at com.google.protobuf.AbstractParser.parseFrom(AbstractParser.java:49)
	at org.apache.hadoop.security.proto.SecurityProtos$CredentialsProto.parseFrom(SecurityProtos.java:2100)
	at org.apache.hadoop.security.Credentials.readProtos(Credentials.java:331)
	at org.apache.hadoop.security.Credentials.readTokenStorageStream(Credentials.java:226)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.convertCredentialsFromByteBuffer(ApplicationAttemptStateDataPBImpl.java:372)
	at org.apache.hadoop.yarn.server.resourcemanager.recovery.records.impl.pb.ApplicationAttemptStateDataPBImpl.getAppAttemptTokens(ApplicationAttemptStateDataPBImpl.java:152)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.attempt.RMAppAttemptImpl.recover(RMAppAttemptImpl.java:914)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.recover(RMAppImpl.java:858)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:998)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl$RMAppRecoveredTransition.transition(RMAppImpl.java:991)
	at org.apache.hadoop.yarn.state.StateMachineFactory$MultipleInternalArc.doTransition(StateMachineFactory.java:385)
	at org.apache.hadoop.yarn.state.StateMachineFactory.doTransition(StateMachineFactory.java:302)
	at org.apache.hadoop.yarn.state.StateMachineFactory.access$300(StateMachineFactory.java:46)
	at org.apache.hadoop.yarn.state.StateMachineFactory$InternalStateMachine.doTransition(StateMachineFactory.java:448)
	at org.apache.hadoop.yarn.server.resourcemanager.rmapp.RMAppImpl.handle(RMAppImpl.java:816)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recoverApplication(RMAppManager.java:331)
	at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.recover(RMAppManager.java:477)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.recover(ResourceManager.java:1310)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:665)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator$MyResourceManager.serviceStart(TestRMContainerAllocator.java:855)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator.testRMContainerAllocatorResendsRequestsOnRMRestart(TestRMContainerAllocator.java:2535)



Hadoop-Mapreduce-trunk-Java8 - Build # 1304 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1304/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9850 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.579 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.017 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests in error: 
  TestJobCleanup.tearDown:91 » NoClassDefFound org/apache/hadoop/util/ShutdownTh...
  TestJobName>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » YarnRuntime
  TestJobName>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » YarnRuntime
  TestMRIntermediateDataEncryption.testUberMode:60->doEncryptionTest:75->doEncryptionTest:90 » YarnRuntime

Tests run: 530, Failures: 5, Errors: 3, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.367 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.671 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.380 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:29 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-21T15:44:27+00:00
[INFO] Final Memory: 38M/164M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
8 tests failed.
FAILED:  org.apache.hadoop.mapred.TestJobCleanup.org.apache.hadoop.mapred.TestJobCleanup

Error Message:
org/apache/hadoop/util/ShutdownThreadsHelper

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ShutdownThreadsHelper
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.serviceStop(HistoryFileManager.java:681)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceStop(JobHistory.java:172)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceStop(JobHistoryServer.java:208)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStop(MiniMRYarnCluster.java:257)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.service.ServiceOperations.stop(ServiceOperations.java:52)
	at org.apache.hadoop.service.ServiceOperations.stopQuietly(ServiceOperations.java:80)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.TestJobCleanup.tearDown(TestJobCleanup.java:91)


FAILED:  org.apache.hadoop.mapred.TestJobName.testComplexNameWithRegex

Error Message:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:250)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2
	at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:180)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1494)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:824)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:718)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.Hdfs$2
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:180)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1494)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:824)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:718)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestJobName.testComplexName

Error Message:
java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:250)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/fs/Hdfs$2
	at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:180)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1494)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:824)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:718)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistoryServer.serviceInit(JobHistoryServer.java:152)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster$JobHistoryServerWrapper.serviceStart(MiniMRYarnCluster.java:232)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
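
For context, the assertion above is a plain JUnit equality check on the number of entries left in the job output directory after abort(); "expected:<0> but was:<4>" means four files survived the abort. A minimal sketch of that kind of check follows (JUnit 4; this is not the actual Hadoop test, and the directory path and class name are hypothetical):

    import static org.junit.Assert.assertEquals;

    import java.io.File;
    import org.junit.Test;

    // Sketch only: reproduces the shape of the failing assertion, not the
    // TestMRCJCFileOutputCommitter setup itself.
    public class OutputDirCleanupSketch {
      @Test
      public void outputDirShouldBeEmptyAfterAbort() {
        File outDir = new File("/tmp/sketch-job-output"); // hypothetical path
        String[] leftovers = outDir.list();
        int fileCount = (leftovers == null) ? 0 : leftovers.length;
        // assertEquals(message, expected, actual) reports
        // "<message> expected:<0> but was:<N>" on failure.
        assertEquals("Output directory not empty", 0, fileCount);
      }
    }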


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
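
For context, the "null" error message here and in the other TestUberAM failures below is what the report shows when JUnit's assertTrue(condition) is called without a message string: on failure it raises an AssertionError whose message is null, so the counter that was actually wrong is not named. A minimal sketch of the mechanism (hypothetical class name; not the counter verification in TestMRJobs itself):

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    // Sketch only: shows why a message-less assertTrue is reported as
    // "java.lang.AssertionError: null" in the surefire output.
    public class MessagelessAssertSketch {
      @Test
      public void failsWithNullMessage() {
        long counterValue = 0; // stand-in for a job counter that came back as 0
        assertTrue(counterValue > 0); // no message argument, so the failure carries none
      }
    }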


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1303 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1303/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9569 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.032 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.533 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.524 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.39 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 529, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.582 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.759 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.477 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:14 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:53 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-04-21T09:48:16+00:00
[INFO] Final Memory: 34M/163M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1302 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1302/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8474 lines...]
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.917 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.841 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.079 sec - in org.apache.hadoop.mapred.TestClock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.163 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-04-21T06:21:07+00:00
[INFO] Final Memory: 32M/203M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1301 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1301/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9565 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.492 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.578 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.457 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.158 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.524 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.206 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-21T06:01:54+00:00
[INFO] Final Memory: 34M/135M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1300 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1300/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9578 lines...]
Running org.apache.hadoop.mapreduce.v2.TestMRJobsWithProfiler

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests in error: 
  TestReduceFetchFromPartialMem.setUp:54 » YarnRuntime org.apache.hadoop.yarn.ex...
  TestMRIntermediateDataEncryption.testUberMode:60->doEncryptionTest:75->doEncryptionTest:90 » NoClassDefFound
  TestMRIntermediateDataEncryption.testMultipleMapsPerNode:65->doEncryptionTest:75->doEncryptionTest:90 » NoClassDefFound
  TestMRIntermediateDataEncryption.testMultipleReducers:70->doEncryptionTest:75->doEncryptionTest:90 » NoClassDefFound
  TestMRIntermediateDataEncryption.testSingleReducer:55->doEncryptionTest:75->doEncryptionTest:90 » NoClassDefFound
  TestMRJobsWithHistoryService.testJobHistoryData:153 » IO java.io.IOException: ...

Tests run: 363, Failures: 4, Errors: 6, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.467 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.560 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.224 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:32 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:52 h
[INFO] Finished at: 2016-04-21T03:40:39+00:00
[INFO] Final Memory: 39M/142M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter6053755621527023841.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire7628787607766364785tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1887739250710314358259tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
10 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testUberMode

Error Message:
org/apache/hadoop/yarn/server/utils/BuilderUtils

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/utils/BuilderUtils
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils.<clinit>(RMServerUtils.java:423)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:237)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.initResourceManager(MiniYARNCluster.java:318)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$200(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceInit(MiniYARNCluster.java:458)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceInit(MiniYARNCluster.java:286)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceInit(MiniMRYarnCluster.java:186)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:79)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:90)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:75)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testUberMode(TestMRIntermediateDataEncryption.java:60)
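
For context, this first TestMRIntermediateDataEncryption failure shows the missing class (BuilderUtils) being hit inside the static initializer of RMServerUtils; the remaining failures below then report "Could not initialize class ... RMServerUtils" because the JVM marks a class whose static initialization failed as unusable and throws a bare NoClassDefFoundError on every later use. A minimal, self-contained sketch of that behavior (the class names here are made up, not Hadoop classes):

    // Sketch only: once a static initializer fails, later uses of the class
    // report "Could not initialize class ..." without the original root cause.
    public class StaticInitFailureSketch {

      static class Broken {
        // Fails during class initialization, simulating a dependency that is
        // missing from the classpath (as with BuilderUtils above).
        static final int VALUE = compute();

        static int compute() {
          throw new RuntimeException("simulated missing dependency");
        }
      }

      public static void main(String[] args) {
        for (int attempt = 1; attempt <= 2; attempt++) {
          try {
            System.out.println(Broken.VALUE);
          } catch (Throwable t) {
            // Attempt 1: ExceptionInInitializerError carrying the root cause.
            // Attempt 2: NoClassDefFoundError "Could not initialize class ...Broken".
            System.out.println("attempt " + attempt + ": " + t);
          }
        }
      }
    }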


FAILED:  org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testMultipleMapsPerNode

Error Message:
Could not initialize class org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:237)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.initResourceManager(MiniYARNCluster.java:318)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$200(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceInit(MiniYARNCluster.java:458)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceInit(MiniYARNCluster.java:286)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceInit(MiniMRYarnCluster.java:186)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:79)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:90)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:75)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testMultipleMapsPerNode(TestMRIntermediateDataEncryption.java:65)


FAILED:  org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testMultipleReducers

Error Message:
Could not initialize class org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:237)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.initResourceManager(MiniYARNCluster.java:318)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$200(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceInit(MiniYARNCluster.java:458)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceInit(MiniYARNCluster.java:286)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceInit(MiniMRYarnCluster.java:186)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:79)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:90)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:75)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testMultipleReducers(TestMRIntermediateDataEncryption.java:70)


FAILED:  org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testSingleReducer

Error Message:
Could not initialize class org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils

Stack Trace:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.server.resourcemanager.RMServerUtils
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceInit(ResourceManager.java:237)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.initResourceManager(MiniYARNCluster.java:318)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$200(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceInit(MiniYARNCluster.java:458)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceInit(MiniYARNCluster.java:286)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceInit(MiniMRYarnCluster.java:186)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:79)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:41)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:90)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.doEncryptionTest(TestMRIntermediateDataEncryption.java:75)
	at org.apache.hadoop.mapred.TestMRIntermediateDataEncryption.testSingleReducer(TestMRIntermediateDataEncryption.java:55)


FAILED:  org.apache.hadoop.mapred.TestReduceFetchFromPartialMem.testReduceFromPartialMem

Error Message:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to load class: [org.apache.hadoop.yarn.server.api.impl.pb.service.ResourceManagerAdministrationProtocolPBServiceImpl]

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to load class: [org.apache.hadoop.yarn.server.api.impl.pb.service.ResourceManagerAdministrationProtocolPBServiceImpl]
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2178)
	at org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl.getServer(RpcServerFactoryPBImpl.java:84)
	at org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC.getServer(HadoopYarnProtoRPC.java:65)
	at org.apache.hadoop.yarn.ipc.YarnRPC.getServer(YarnRPC.java:54)
	at org.apache.hadoop.yarn.server.resourcemanager.AdminService.startServer(AdminService.java:188)
	at org.apache.hadoop.yarn.server.resourcemanager.AdminService.serviceStart(AdminService.java:175)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1182)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestReduceFetchFromPartialMem.setUp(TestReduceFetchFromPartialMem.java:54)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService.testJobHistoryData

Error Message:
java.io.IOException: Unknown Job job_1461209899959_0001
 at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.verifyAndGetJob(HistoryClientService.java:219)
 at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.getCounters(HistoryClientService.java:233)
 at org.apache.hadoop.mapreduce.v2.api.impl.pb.service.MRClientProtocolPBServiceImpl.getCounters(MRClientProtocolPBServiceImpl.java:159)
 at org.apache.hadoop.yarn.proto.MRClientProtocol$MRClientProtocolService$2.callBlockingMethod(MRClientProtocol.java:281)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:637)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2423)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2419)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:422)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2417)


Stack Trace:
java.io.IOException: java.io.IOException: Unknown Job job_1461209899959_0001
	at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.verifyAndGetJob(HistoryClientService.java:219)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.getCounters(HistoryClientService.java:233)
	at org.apache.hadoop.mapreduce.v2.api.impl.pb.service.MRClientProtocolPBServiceImpl.getCounters(MRClientProtocolPBServiceImpl.java:159)
	at org.apache.hadoop.yarn.proto.MRClientProtocol$MRClientProtocolService$2.callBlockingMethod(MRClientProtocol.java:281)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:637)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:989)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2423)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2419)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2417)

	at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1443)
	at org.apache.hadoop.ipc.Client.call(Client.java:1402)
	at org.apache.hadoop.ipc.Client.call(Client.java:1352)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:230)
	at com.sun.proxy.$Proxy91.getCounters(Unknown Source)
	at org.apache.hadoop.mapreduce.v2.api.impl.pb.client.MRClientProtocolPBClientImpl.getCounters(MRClientProtocolPBClientImpl.java:166)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:325)
	at org.apache.hadoop.mapred.ClientServiceDelegate.getJobCounters(ClientServiceDelegate.java:382)
	at org.apache.hadoop.mapred.YARNRunner.getJobCounters(YARNRunner.java:607)
	at org.apache.hadoop.mapreduce.Job$8.run(Job.java:812)
	at org.apache.hadoop.mapreduce.Job$8.run(Job.java:809)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.mapreduce.Job.getCounters(Job.java:809)
	at org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService.testJobHistoryData(TestMRJobsWithHistoryService.java:153)
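
The "Unknown Job" IOException above is raised by the JobHistoryServer when counters are requested for a job whose history file it has not scanned yet. A hedged sketch of one way a test could tolerate that window follows: retry Job.getCounters() briefly instead of asserting on the first call. Job and Counters are the standard org.apache.hadoop.mapreduce classes; the helper itself is hypothetical and not part of the failing test.

import java.io.IOException;

import org.apache.hadoop.mapreduce.Counters;
import org.apache.hadoop.mapreduce.Job;

// Hypothetical helper: poll for counters while the JobHistoryServer catches up.
public final class CounterPolling {
    private CounterPolling() {}

    public static Counters getCountersWithRetry(Job job, int attempts, long sleepMillis)
            throws IOException, InterruptedException {
        IOException last = null;
        for (int i = 0; i < attempts; i++) {
            try {
                // After the AM exits, this call is routed to the history server.
                return job.getCounters();
            } catch (IOException e) {
                // e.g. "Unknown Job job_..." while the history file is still being loaded.
                last = e;
                Thread.sleep(sleepMillis);
            }
        }
        throw last != null ? last : new IOException("getCounters never attempted");
    }
}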


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
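
Each of the four TestUberAM failures above reports only "java.lang.AssertionError: null" because the single-argument JUnit assertTrue carries no failure message. A small illustrative sketch of the two-argument overload follows, so that a counter check names the value that was wrong; the counter variable here is hypothetical, not the one used by TestMRJobs.

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class MessageBearingAssertExample {

    @Test
    public void reportsWhichCounterWasWrong() {
        long mapsCompleted = 3;   // hypothetical value read from job counters
        // With a message, a failure reads "AssertionError: expected at least one
        // completed map but saw 0" instead of the bare "AssertionError: null".
        assertTrue("expected at least one completed map but saw " + mapsCompleted,
                mapsCompleted >= 1);
    }
}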



Hadoop-Mapreduce-trunk-Java8 - Build # 1299 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1299/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8829 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.04 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.7 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.656 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.245 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestKill.testKillTask:119 Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 344, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.710 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:08 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 45.921 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.504 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:33 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:44 min
[INFO] Finished at: 2016-04-20T23:15:59+00:00
[INFO] Final Memory: 36M/205M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask

Error Message:
Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:416)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask(TestKill.java:119)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)



Hadoop-Mapreduce-trunk-Java8 - Build # 1298 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1298/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.475 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.063 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.759 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.222 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.747 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.494 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:11 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:41 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-20T21:56:14+00:00
[INFO] Final Memory: 34M/169M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1297 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1297/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9600 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.519 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.992 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests in error: 
  TestMiniMRClientCluster.testRestart:114 » YarnRuntime org.apache.hadoop.yarn.w...

Tests run: 533, Failures: 5, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.335 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.493 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.387 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:15 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-20T19:28:53+00:00
[INFO] Final Memory: 34M/159M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
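
An abort test that finds the output directory non-empty ("expected:<0> but was:<4>") is usually seeing files left behind by an earlier run or by an abort that did not clean up. A hedged sketch of wiping the output path with the standard FileSystem API follows; the method is illustrative and the real test's directory layout may differ.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Illustrative cleanup helper, typically called from setUp/tearDown.
public final class OutputDirCleanup {
    private OutputDirCleanup() {}

    public static void wipe(Configuration conf, Path outDir) throws IOException {
        FileSystem fs = outDir.getFileSystem(conf);
        if (fs.exists(outDir)) {
            // Recursive delete so stale part files cannot fail the next run's assert.
            fs.delete(outDir, true);
        }
    }
}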


FAILED:  org.apache.hadoop.mapred.TestMiniMRClientCluster.testRestart

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.mortbay.jetty.nio.SelectChannelConnector.open(SelectChannelConnector.java:216)
	at org.apache.hadoop.http.HttpServer2.openListeners(HttpServer2.java:915)
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:857)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.restart(MiniMRYarnClusterAdapter.java:73)
	at org.apache.hadoop.mapred.TestMiniMRClientCluster.testRestart(TestMiniMRClientCluster.java:114)
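
The bind failure inside sun.nio.ch.Net.bind0 while restarting the ResourceManager web app is the usual "port already in use" race on a shared build slave. A sketch follows of asking for ephemeral ports before the mini cluster is initialized; it assumes the standard YarnConfiguration web app keys are honoured by MiniYARNCluster in this code path, and the cluster name is purely illustrative.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster;
import org.apache.hadoop.yarn.conf.YarnConfiguration;

// Sketch only: port 0 lets the OS pick any free port, so a restart cannot
// collide with a port still held by another build on the same slave.
public class EphemeralPortMiniCluster {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set(YarnConfiguration.RM_WEBAPP_ADDRESS, "0.0.0.0:0");
        conf.set(YarnConfiguration.RM_WEBAPP_HTTPS_ADDRESS, "0.0.0.0:0");

        MiniMRYarnCluster cluster = new MiniMRYarnCluster("ephemeral-port-demo");
        cluster.init(conf);
        cluster.start();
        try {
            System.out.println("RM web app bound to " +
                cluster.getConfig().get(YarnConfiguration.RM_WEBAPP_ADDRESS));
        } finally {
            cluster.stop();
        }
    }
}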


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1296 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1296/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9563 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.463 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.905 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.542 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.209 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.446 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.291 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-19T21:46:04+00:00
[INFO] Final Memory: 34M/137M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1295 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1295/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.27 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.01 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.586 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.215 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.744 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.291 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-19T13:23:09+00:00
[INFO] Final Memory: 34M/141M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1294 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1294/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8474 lines...]
Running org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.204 sec - in org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.245 sec - in org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.188 sec - in org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.67 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.248 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-04-19T10:21:47+00:00
[INFO] Final Memory: 31M/211M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1293 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1293/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8475 lines...]
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.608 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.697 sec - in org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.875 sec - in org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.345 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.755 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [03:02 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:10 min
[INFO] Finished at: 2016-04-19T08:26:22+00:00
[INFO] Final Memory: 32M/209M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1292 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1292/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8828 lines...]
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.934 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.175 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.039 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.782 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.395 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:03 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.431 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.171 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:07 min
[INFO] Finished at: 2016-04-19T01:35:54+00:00
[INFO] Final Memory: 35M/201M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk-Java8 - Build # 1291 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1291/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.914 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.373 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.082 sec - in org.apache.hadoop.mapred.TestClock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.362 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-04-18T22:15:26+00:00
[INFO] Final Memory: 32M/185M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
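
The "java.lang.AssertionError: null" above is what JUnit 4 reports when the one-argument Assert.assertTrue(condition) fails: it delegates to the two-argument form with a null message (hence the Assert.java:52 -> 41 -> 86 frames), so the thrown AssertionError carries no message and the report can only print "null". A trivial illustration, not the actual TestCLI code:

    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class NullMessageFailureExample {
      @Test
      public void failsWithNullMessage() {
        boolean jobFound = false; // hypothetical condition for illustration
        // Fails as "java.lang.AssertionError: null" because no message is given.
        assertTrue(jobFound);
      }
    }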



Hadoop-Mapreduce-trunk-Java8 - Build # 1290 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1290/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.699 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.017 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.515 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.257 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.389 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.474 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-18T21:45:25+00:00
[INFO] Final Memory: 34M/148M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
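
All four failures in this run funnel through the same counter checks in TestMRJobs, so when iterating on a fix it is usually enough to re-run only the affected class. One way to do that with the surefire plugin, assuming the sibling modules have already been built and installed locally, is to select the class (or a single method via the ClassName#methodName form) from the project root, for example:

    mvn test -pl :hadoop-mapreduce-client-jobclient -Dtest=TestUberAM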



Hadoop-Mapreduce-trunk-Java8 - Build # 1289 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1289/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8951 lines...]
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.755 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.315 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Tests in error: 
  TestMRApp.testJobRebootNotLastRetryOnUnregistrationFailure:503 » YarnRuntime c...
  TestMRApp.testZeroMapReduces:99 » YarnRuntime could not cleanup test dir
  TestMRApp.testCountersOnJobFinish:566 » YarnRuntime could not cleanup test dir
  TestMRApp.testZeroMaps:91 » YarnRuntime could not cleanup test dir
  TestMRApp.testMapReduce:82 » YarnRuntime could not cleanup test dir
  TestMRApp.testJobRebootOnLastRetryOnUnregistrationFailure:526 » YarnRuntime co...
  TestRMContainerAllocator.testAttemptNotFoundCausesRMCommunicatorException »  U...
  TestRMContainerAllocator.testAMRMTokenUpdate:2691 » NoClassDefFound org/apache...

Tests run: 318, Failures: 0, Errors: 6, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.494 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.642 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.482 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:05 min
[INFO] Finished at: 2016-04-18T13:42:26+00:00
[INFO] Final Memory: 35M/196M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefirebooter3185919009971322475.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire7123883694347492060tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire/surefire_105956594999218940568tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootNotLastRetryOnUnregistrationFailure

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootNotLastRetryOnUnregistrationFailure(TestMRApp.java:503)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMapReduces

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMapReduces(TestMRApp.java:99)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testCountersOnJobFinish

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp$MRAppWithSpiedJob.<init>(TestMRApp.java:551)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp$MRAppWithSpiedJob.<init>(TestMRApp.java:546)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testCountersOnJobFinish(TestMRApp.java:566)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMaps

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testZeroMaps(TestMRApp.java:91)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testMapReduce

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:212)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:193)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:154)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testMapReduce(TestMRApp.java:82)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootOnLastRetryOnUnregistrationFailure

Error Message:
could not cleanup test dir

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:243)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:205)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.<init>(MRApp.java:199)
	at org.apache.hadoop.mapreduce.v2.app.TestMRApp.testJobRebootOnLastRetryOnUnregistrationFailure(TestMRApp.java:526)
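
All six TestMRApp errors in this run are the same YarnRuntimeException raised while the MRApp test harness sets up its per-test working directory: the constructor obtains a local FileContext and wipes the previous test dir, and any exception in that step is rethrown as "could not cleanup test dir" (here the underlying failure surfaces inside FileContext/AbstractFileSystem, consistent with the forked-VM and classpath problems reported for the same build). A rough sketch of that wrap-and-rethrow pattern, reconstructed from the stack traces rather than copied from MRApp:

    import java.io.File;

    import org.apache.hadoop.fs.FileContext;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.yarn.exceptions.YarnRuntimeException;

    public class TestDirCleanupSketch {
      // Deletes the per-test working directory, wrapping any failure the way
      // the "could not cleanup test dir" traces above suggest.
      static void cleanupTestDir(File testWorkDir) {
        try {
          FileContext fc = FileContext.getLocalFSFileContext();
          fc.delete(new Path(testWorkDir.getAbsolutePath()), true);
        } catch (Exception e) {
          throw new YarnRuntimeException("could not cleanup test dir", e);
        }
      }
    }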



Hadoop-Mapreduce-trunk-Java8 - Build # 1288 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1288/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8493 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.174 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.255 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.086 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.473 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.436 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:42 min
[INFO] Finished at: 2016-04-18T02:21:10+00:00
[INFO] Final Memory: 31M/231M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefirebooter1041886820080947348.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefire4056072975028612414tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefire_185183395831094167955tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
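
Note that the Maven goal failure in the console above for this build is surefire's "The forked VM terminated without properly saying goodbye. VM crash or System.exit called?" rather than the usual "There are test failures." That message means the forked test JVM died before reporting back to Maven. Since the fork is started with -XX:+HeapDumpOnOutOfMemoryError (visible in the command line above), an out-of-memory death would typically leave a java_pid<pid>.hprof heap dump in the fork's working directory, and a hard JVM crash normally leaves an hs_err_pid<pid>.log there; either file is the first place to look on the build slave.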



Hadoop-Mapreduce-trunk-Java8 - Build # 1287 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1287/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9575 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.522 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.587 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.858 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.185 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.120 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.321 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-17T08:28:38+00:00
[INFO] Final Memory: 34M/152M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
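
The TestMRCJCFileOutputCommitter.testAbort failure above ("Output directory not empty expected:<0> but was:<4>") is again the three-argument assertEquals form: the check expects the output directory to be empty after the abort, but four entries were left behind. A hypothetical helper showing the kind of check behind that message, not the actual TestMRCJCFileOutputCommitter code:

    import static org.junit.Assert.assertEquals;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class OutputDirCheckSketch {
      // Counts whatever is left under the output directory after the abort
      // and expects it to be empty.
      static void assertOutputDirEmpty(Path outDir) throws Exception {
        FileSystem fs = outDir.getFileSystem(new Configuration());
        FileStatus[] leftovers = fs.listStatus(outDir);
        assertEquals("Output directory not empty", 0, leftovers.length);
      }
    }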



Hadoop-Mapreduce-trunk-Java8 - Build # 1286 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1286/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9572 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.49 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.481 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.715 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.219 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.076 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.264 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-17T05:27:23+00:00
[INFO] Final Memory: 34M/149M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1285 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1285/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.973 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.296 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.001 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.675 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.351 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.006 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.104 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:03 min
[INFO] Finished at: 2016-04-16T22:32:35+00:00
[INFO] Final Memory: 36M/228M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
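
A side note on the repeated "Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0" lines in these console excerpts: PermGen, and with it the -XX:MaxPermSize flag, was removed in JDK 8, so the option is simply ignored on this Java 8 build. The nearest Java 8 equivalent, if a cap is actually wanted, would be -XX:MaxMetaspaceSize; the warnings themselves are harmless noise rather than an error in their own right.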



Hadoop-Mapreduce-trunk-Java8 - Build # 1284 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1284/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10580 lines...]
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesTasks.<init>:111->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer

Tests run: 344, Failures: 0, Errors: 77, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.353 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.444 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.431 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:01 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:11 min
[INFO] Finished at: 2016-04-16T00:39:23+00:00
[INFO] Final Memory: 35M/183M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
77 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfo

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAM

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidAccept

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodesXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodes

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXMLCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobs

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testJobTaskCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryReduce

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasks

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryMap

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)
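
Note on the failures above: every test method in TestAMWebServicesJobs and TestAMWebServicesTasks died in the JerseyTest constructor with java.net.BindException, meaning the embedded Grizzly HTTP server could not bind its TCP port because something else on the build slave was still holding it; the web-service logic under test never ran. One common way to reduce this kind of collision in test code (shown here only as an illustrative sketch, not as the fix applied to these tests) is to ask the operating system for a currently free ephemeral port and hand that to the embedded container, accepting the small race that remains between probing and binding. The helper name FreePortProbe below is hypothetical.

    import java.io.IOException;
    import java.net.ServerSocket;

    // Hypothetical helper: asks the OS for an ephemeral port that is free right now.
    public final class FreePortProbe {
        private FreePortProbe() {
        }

        public static int freePort() throws IOException {
            // Binding to port 0 lets the kernel pick an unused ephemeral port.
            try (ServerSocket socket = new ServerSocket(0)) {
                socket.setReuseAddress(true);
                return socket.getLocalPort();
            }
        }
    }

A test fixture could call FreePortProbe.freePort() when configuring its embedded server instead of relying on a fixed default port.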



Hadoop-Mapreduce-trunk-Java8 - Build # 1283 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1283/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.928 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.268 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.302 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.261 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.403 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.237 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:13 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-15T23:35:36+00:00
[INFO] Final Memory: 34M/148M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
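
Note on the "Error Message: null" entries above: the stack traces show org.junit.Assert.assertTrue at Assert.java:52 delegating to the two-argument form at line 41 and then to fail at line 86, i.e. the counter checks use the message-less assertTrue(boolean), which calls fail(null), so the AssertionError carries no message and Jenkins prints it as null. A minimal, self-contained illustration of the difference (class and method names are invented for the example):

    import org.junit.Assert;
    import org.junit.Test;

    public class NullMessageAssertExample {
        @Test
        public void messageLessAssert() {
            // assertTrue(boolean) calls assertTrue(null, condition), which calls
            // fail(null), so a failure is reported as "Error Message: null".
            Assert.assertTrue(2 + 2 == 5);
        }

        @Test
        public void assertWithMessage() {
            // Supplying a message makes the Jenkins report self-explanatory.
            Assert.assertTrue("sleep job counters did not match the expected totals",
                    2 + 2 == 5);
        }
    }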



Hadoop-Mapreduce-trunk-Java8 - Build # 1282 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1282/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8474 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.945 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.277 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.085 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.537 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.435 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:42 min
[INFO] Finished at: 2016-04-15T19:53:15+00:00
[INFO] Final Memory: 31M/211M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1281 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1281/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9568 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.03 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.561 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.581 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.589 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 532, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.364 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.793 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.507 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:01 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:54 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:13 h
[INFO] Finished at: 2016-04-15T19:33:21+00:00
[INFO] Final Memory: 37M/160M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
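
The message format here ("Output directory not empty expected:<0> but was:<4>") is JUnit's standard assertEquals rendering: a caller-supplied prefix followed by the expected and actual values. A rough, hypothetical reconstruction of such a check follows; the path and the way the directory is listed are illustrative only, not the actual TestMRCJCFileOutputCommitter code:

    import java.io.File;
    import org.junit.Assert;
    import org.junit.Test;

    public class EmptyOutputDirSketch {
        @Test
        public void outputDirIsEmptyAfterAbort() {
            File outDir = new File("/tmp/sketch-output");            // hypothetical path
            String[] leftovers = outDir.list();
            int remaining = (leftovers == null) ? 0 : leftovers.length;
            // On failure JUnit reports:
            //   Output directory not empty expected:<0> but was:<N>
            // where N is the number of entries still present after the abort.
            Assert.assertEquals("Output directory not empty", 0, remaining);
        }
    }

A failure of this shape usually means the committer's abort path left temporary or committed files behind instead of cleaning the output directory.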


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1280 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1280/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8818 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.813 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.324 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.074 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.786 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:190->testTaskAttemptAssignedKilledHistory:403 No Ta Started JH Event

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.694 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.153 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.471 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:51 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:25 min
[INFO] Finished at: 2016-04-15T06:42:31+00:00
[INFO] Final Memory: 36M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:403)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:190)



Hadoop-Mapreduce-trunk-Java8 - Build # 1279 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1279/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8820 lines...]
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.772 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.199 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.851 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.289 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Failed tests: 
  TestRecovery.testCrashed:188 TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Tests run: 344, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.168 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.051 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.309 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:23 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:33 min
[INFO] Finished at: 2016-04-14T22:05:51+00:00
[INFO] Final Memory: 36M/205M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Stack Trace:
java.lang.AssertionError: TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:382)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:188)



Hadoop-Mapreduce-trunk-Java8 - Build # 1278 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1278/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9572 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.533 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.264 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.648 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.258 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.071 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.151 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:13 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-14T21:24:51+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1277 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1277/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8826 lines...]
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.295 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.474 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.952 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.592 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>
  TestKill.testKillTask:119 Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>

Tests run: 344, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.817 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:34 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 42.601 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 11.742 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [14:34 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 18:09 min
[INFO] Finished at: 2016-04-14T18:41:36+00:00
[INFO] Final Memory: 36M/189M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask

Error Message:
Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:416)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask(TestKill.java:119)



Hadoop-Mapreduce-trunk-Java8 - Build # 1276 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1276/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8818 lines...]
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.608 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.008 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.626 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.262 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:186->testTaskAttemptAssignedKilledHistory:399 No Ta Started JH Event

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.158 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.321 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.436 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:56 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:04 min
[INFO] Finished at: 2016-04-14T15:05:52+00:00
[INFO] Final Memory: 35M/205M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:399)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:186)



Hadoop-Mapreduce-trunk-Java8 - Build # 1275 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1275/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.935 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.566 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.129 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.338 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.217 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.509 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:12 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-14T14:47:39+00:00
[INFO] Final Memory: 34M/151M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1274 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1274/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9563 lines...]
Running org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.85 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.631 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.478 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.349 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.315 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.421 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-14T12:23:14+00:00
[INFO] Final Memory: 34M/167M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1273 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1273/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10581 lines...]
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServices.<init>:104->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesAttempt.<init>:130->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesAttempt.<init>:130->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesAttempt.<init>:130->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesAttempt.<init>:130->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobConf.<init>:151->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobConf.<init>:151->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobConf.<init>:151->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer
  TestAMWebServicesJobConf.<init>:151->JerseyTest.<init>:217->JerseyTest.getContainer:342 » TestContainer

Tests run: 340, Failures: 0, Errors: 77, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.116 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.778 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.403 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [09:39 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:50 min
[INFO] Finished at: 2016-04-14T03:30:49+00:00
[INFO] Final Memory: 36M/188M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
77 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)
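
The TestContainerException failures in this run all wrap the same root cause, java.net.BindException: the embedded Grizzly HTTP server that the Jersey test framework starts in the test constructor tries to bind a port that is already in use on the build slave (for example a parallel run or a server left over from an earlier test). A self-contained sketch of that failure mode, plus the common workaround of binding port 0 so the kernel picks a free ephemeral port; the port number below is arbitrary and nothing here is taken from the Jersey test setup:

    import java.net.BindException;
    import java.net.InetSocketAddress;
    import java.net.ServerSocket;

    public class PortConflictSketch {
        public static void main(String[] args) throws Exception {
            try (ServerSocket first = new ServerSocket()) {
                first.bind(new InetSocketAddress(9998));          // first bind of a fixed port succeeds
                try (ServerSocket second = new ServerSocket()) {
                    second.bind(new InetSocketAddress(9998));     // same port again: the bind is refused
                } catch (BindException e) {
                    System.out.println("conflict: " + e.getMessage());  // typically "Address already in use"
                }
                try (ServerSocket ephemeral = new ServerSocket()) {
                    ephemeral.bind(new InetSocketAddress(0));     // port 0: kernel assigns any free port
                    System.out.println("free port: " + ephemeral.getLocalPort());
                }
            }
        }
    }

Because the bind happens while the test class is being constructed, every test method in the affected classes fails the same way, which is consistent with the 77 errors reported for this module.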


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfo

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAM

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidAccept

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodesXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testBlacklistedNodes

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInvalidUri

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testInfoSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.testAMSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServices.<init>(TestAMWebServices.java:104)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testGetTaskAttemptIdState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.testPutTaskAttemptIdXMLState

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt.<init>(TestAMWebServicesAttempt.java:130)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdXMLCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.testTaskAttemptIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts.<init>(TestAMWebServicesAttempts.java:114)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConf

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.testJobConfDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf.<init>(TestAMWebServicesJobConf.java:151)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobs

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttempts

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdInvalidDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.testJobAttemptsDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs.<init>(TestAMWebServicesJobs.java:116)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCounters

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdBogus

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testJobTaskCountersXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryReduce

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasks

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryMap

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid2

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid3

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdNonExist

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdXML

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskIdCountersDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksQueryInvalid

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksDefault

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTasksSlash

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)


FAILED:  org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.testTaskId

Error Message:
java.net.BindException: Address already in use

Stack Trace:
com.sun.jersey.test.framework.spi.container.TestContainerException: java.net.BindException: Address already in use
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:413)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:384)
	at org.glassfish.grizzly.nio.transport.TCPNIOTransport.bind(TCPNIOTransport.java:375)
	at org.glassfish.grizzly.http.server.NetworkListener.start(NetworkListener.java:549)
	at org.glassfish.grizzly.http.server.HttpServer.start(HttpServer.java:255)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:326)
	at com.sun.jersey.api.container.grizzly2.GrizzlyServerFactory.createHttpServer(GrizzlyServerFactory.java:343)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.instantiateGrizzlyWebServer(GrizzlyWebTestContainerFactory.java:219)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:129)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory$GrizzlyWebTestContainer.<init>(GrizzlyWebTestContainerFactory.java:86)
	at com.sun.jersey.test.framework.spi.container.grizzly2.web.GrizzlyWebTestContainerFactory.create(GrizzlyWebTestContainerFactory.java:79)
	at com.sun.jersey.test.framework.JerseyTest.getContainer(JerseyTest.java:342)
	at com.sun.jersey.test.framework.JerseyTest.<init>(JerseyTest.java:217)
	at org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesTasks.<init>(TestAMWebServicesTasks.java:111)
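
Note on the block of failures above: every TestAMWebServicesJobs and TestAMWebServicesTasks case listed here dies in the test-class constructor (JerseyTest starting its Grizzly container) with "java.net.BindException: Address already in use", before any web-service assertion runs. That points at a port collision on the Jenkins slave (another test or a leftover process holding the port) rather than at the AM web services themselves. As a rough illustration only, and not the code JerseyTest actually uses, a test that needs an HTTP port can ask the kernel for a currently free ephemeral port instead of relying on a fixed one:

    import java.io.IOException;
    import java.net.ServerSocket;

    /** Illustrative helper: ask the OS for a TCP port that is free right now. */
    final class FreePort {
        static int pick() throws IOException {
            // Binding to port 0 makes the kernel choose an unused ephemeral port.
            try (ServerSocket probe = new ServerSocket(0)) {
                return probe.getLocalPort();
            }
        }
    }

    // Hypothetical usage: pass FreePort.pick() to whatever container the test starts.

There is still a small race between releasing the probe socket and the container binding the port, so this is a mitigation for shared build slaves, not a guarantee.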



Hadoop-Mapreduce-trunk-Java8 - Build # 1272 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1272/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.952 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.615 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.14 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.103 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.546 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.450 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-14T02:39:25+00:00
[INFO] Final Memory: 34M/160M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)
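
The four TestUberAM failures above (and their repeats in the builds reported below) all stop at a bare JUnit assertTrue() inside TestMRJobs.verifySleepJobCounters / verifyRandomWriterCounters, which is why the report can only show "Error Message: null": assertTrue(boolean) carries no failure message. A minimal sketch of the pattern, assuming hypothetical names and values rather than the real TestMRJobs code, showing how a message-bearing assertion makes such a counter check self-describing in the Surefire report:

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    // Hypothetical helper; 'expected' and 'actual' stand in for whatever counter
    // values the real test reads back from the finished job.
    final class CounterCheck {
        static void verifyCounter(String name, long expected, long actual) {
            // assertEquals reports both values in the failure message ...
            assertEquals("unexpected value for counter " + name, expected, actual);
            // ... whereas a bare assertTrue(actual == expected) surfaces only as "null".
            assertTrue("counter " + name + " must be non-negative", actual >= 0);
        }
    }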



Hadoop-Mapreduce-trunk-Java8 - Build # 1271 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1271/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9587 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 71.386 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.505 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.443 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMiniMRWithDFSWithDistinctUsers.setUp:78 Test resulted in an unexpected exit

Tests in error: 
  TestMiniMRWithDFSWithDistinctUsers.tearDown:104 » NoClassDefFound org/apache/h...

Tests run: 531, Failures: 5, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.033 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:47 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.443 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.391 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:52 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:52 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:12 h
[INFO] Finished at: 2016-04-13T20:05:46+00:00
[INFO] Final Memory: 37M/154M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.testMultipleSpills

Error Message:
org/apache/hadoop/hdfs/server/namenode/JournalSet$5

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/JournalSet$5
	at org.apache.hadoop.hdfs.server.namenode.JournalSet.close(JournalSet.java:243)
	at org.apache.hadoop.hdfs.server.namenode.FSEditLog.close(FSEditLog.java:375)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1202)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1558)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:790)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:969)
	at org.apache.hadoop.hdfs.MiniDFSCluster.stopAndJoinNameNode(MiniDFSCluster.java:1965)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1911)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.tearDown(TestMiniMRWithDFSWithDistinctUsers.java:104)


FAILED:  org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.testDistinctUsers

Error Message:
Test resulted in an unexpected exit

Stack Trace:
java.lang.AssertionError: Test resulted in an unexpected exit
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1895)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:854)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:482)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:441)
	at org.apache.hadoop.mapred.TestMiniMRWithDFSWithDistinctUsers.setUp(TestMiniMRWithDFSWithDistinctUsers.java:78)
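
The two TestMiniMRWithDFSWithDistinctUsers entries above likely share a root cause in the forked test JVM rather than in the MapReduce code under test: one test's MiniDFSCluster recorded an unexpected exit while starting up, and the other's tearDown then hit a NoClassDefFoundError while closing the edit log, the kind of secondary failure that typically follows once the fork is in a bad state. One common hardening step, sketched below under the assumption that the cluster reference may be null or half-initialized (it is not necessarily the fix for this specific class-loading error), is to make tearDown tolerant of a failed setUp so a single root cause does not surface as two separate failures:

    import org.apache.hadoop.hdfs.MiniDFSCluster;
    import org.junit.After;

    public class ExampleDfsTest {            // hypothetical test class, for illustration
        private MiniDFSCluster dfs;          // stays null if setUp never completed

        @After
        public void tearDown() {
            // Only shut the cluster down if it was actually started; swallow nothing else.
            if (dfs != null) {
                dfs.shutdown();
                dfs = null;
            }
        }
    }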


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1270 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1270/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9565 lines...]
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 59.301 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.243 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 164.002 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.926 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:17 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.403 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.279 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [13:22 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:18 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:10 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:34 h
[INFO] Finished at: 2016-04-13T12:56:46+00:00
[INFO] Final Memory: 34M/151M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1269 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1269/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9386 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestFileSystem
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.551 sec - in org.apache.hadoop.fs.TestFileSystem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.fs.TestJHLA
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.189 sec - in org.apache.hadoop.fs.TestJHLA
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.559 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestLargeSort

Results :

Tests in error: 
  TestMiniMRClasspath.testClassPath:185 » NoClassDefFound org/apache/hadoop/serv...
  TestMiniMRClasspath.testExternalWritable:207 » YarnRuntime could not cleanup t...

Tests run: 296, Failures: 0, Errors: 2, Skipped: 8

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.529 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.692 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.926 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:30 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:17 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:10 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:30 h
[INFO] Finished at: 2016-04-13T02:11:13+00:00
[INFO] Final Memory: 40M/158M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7409155024943407665.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire8007653135475922770tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_1586024897625636958255tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRClasspath.testClassPath

Error Message:
org/apache/hadoop/service/ServiceOperations

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/service/ServiceOperations
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.service.CompositeService.stop(CompositeService.java:157)
	at org.apache.hadoop.service.CompositeService.serviceStop(CompositeService.java:131)
	at org.apache.hadoop.service.AbstractService.stop(AbstractService.java:221)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.stop(MiniMRYarnClusterAdapter.java:55)
	at org.apache.hadoop.mapred.MiniMRCluster.shutdown(MiniMRCluster.java:267)
	at org.apache.hadoop.mapred.TestMiniMRClasspath.testClassPath(TestMiniMRClasspath.java:185)


FAILED:  org.apache.hadoop.mapred.TestMiniMRClasspath.testExternalWritable

Error Message:
could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: could not cleanup test dir: org.apache.hadoop.fs.UnsupportedFileSystemException: fs.AbstractFileSystem.file.impl=null: No AbstractFileSystem configured for scheme: file
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:161)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.<init>(MiniYARNCluster.java:161)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:79)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.<init>(MiniMRYarnCluster.java:75)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:73)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestMiniMRClasspath.testExternalWritable(TestMiniMRClasspath.java:207)
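
Both failures above occur in MiniMRCluster setup and teardown rather than in the tests' own logic: the NoClassDefFoundError for org/apache/hadoop/service/ServiceOperations is raised while CompositeService.stop is shutting the cluster down, and that class normally ships in hadoop-common, so one hedged first check is which hadoop-common the module resolves. The fork command in the console also passes -XX:+HeapDumpOnOutOfMemoryError, so a forked VM that "terminated without properly saying goodbye" may leave hs_err_pid*.log or *.hprof files behind in the module directory. A sketch, assuming the standard maven-dependency-plugin and the module path shown above (the artifact locations are a guess):

    # Hypothetical diagnostics for the jobclient module
    mvn dependency:tree -Dincludes=org.apache.hadoop:hadoop-common \
        -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient
    # Look for crash or heap-dump artifacts left by the forked test VM (paths assumed)
    ls hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/hs_err_pid*.log \
       hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/*.hprof 2>/dev/null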



Hadoop-Mapreduce-trunk-Java8 - Build # 1268 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1268/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.586 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.66 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.832 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.263 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.256 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 42.133 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.773 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:14 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:07 min
[INFO] Finished at: 2016-04-12T22:08:59+00:00
[INFO] Final Memory: 35M/216M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)
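
The assertion at TestKill.java:84 expects the task to have reached KILLED but observes NEW, which reads like a timing-sensitive check. One way to gauge whether it is reproducible or flaky is to loop the single method locally; a sketch, assuming Surefire's Class#method selection (which the 2.17 plugin used here should support) and the app module path from the console output:

    # Hypothetical flakiness check: run the one method a few times, stop on first failure
    for i in 1 2 3 4 5; do
      mvn -q test -Dtest=TestKill#testKillJob \
          -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app || break
    done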



Hadoop-Mapreduce-trunk-Java8 - Build # 1267 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1267/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9573 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.594 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.542 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.995 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 5, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.586 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.148 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.375 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:01 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-04-12T20:26:26+00:00
[INFO] Final Memory: 34M/161M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
5 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1266 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1266/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.177 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.12 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.87 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.11 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:186->testTaskAttemptAssignedKilledHistory:399 No Ta Started JH Event

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.555 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:32 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 39.004 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 10.133 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [13:31 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 17:01 min
[INFO] Finished at: 2016-04-12T14:10:52+00:00
[INFO] Final Memory: 35M/207M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:399)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:186)
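
"No Ta Started JH Event" comes from an assertTrue at TestTaskAttempt.java:399, so the test appears to check that a task-attempt-started job-history event was recorded before the attempt failed in the ASSIGNED state. When the per-class text report is not enough, streaming the test output to the console can help; a sketch, assuming the standard surefire.useFile switch and the app module path shown above:

    # Hypothetical re-run with test output printed to the console instead of the report file
    mvn test -Dtest=TestTaskAttempt#testMRAppHistoryForTAFailedInAssigned \
        -Dsurefire.useFile=false \
        -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app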



Hadoop-Mapreduce-trunk-Java8 - Build # 1265 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1265/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9604 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.582 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.36 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests in error: 
  TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled:172 » IO Jo...
  TestClusterMRNotification>NotificationTestCase.testMR:198 » IO Job didn't fini...

Tests run: 533, Failures: 4, Errors: 2, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.380 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:07 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 43.986 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.192 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [15:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [08:33 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:24 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:52 h
[INFO] Finished at: 2016-04-12T10:15:42+00:00
[INFO] Final Memory: 34M/131M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapred.TestClusterMRNotification.testMR

Error Message:
Job didn't finish in 30 seconds

Stack Trace:
java.io.IOException: Job didn't finish in 30 seconds
	at org.apache.hadoop.mapred.UtilsForTests.runJobFail(UtilsForTests.java:647)
	at org.apache.hadoop.mapred.NotificationTestCase.testMR(NotificationTestCase.java:198)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled

Error Message:
Job didn't finish in 30 seconds

Stack Trace:
java.io.IOException: Job didn't finish in 30 seconds
	at org.apache.hadoop.mapred.UtilsForTests.runJobSucceed(UtilsForTests.java:622)
	at org.apache.hadoop.mapred.TestMRTimelineEventHandling.testMapreduceJobTimelineServiceEnabled(TestMRTimelineEventHandling.java:172)
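
Both "Job didn't finish in 30 seconds" errors are thrown by the UtilsForTests helpers named in the stack traces, so the 30-second limit is part of the test harness itself rather than a Surefire timeout, and raising fork timeouts would not change it. Re-running only the two timed-out classes keeps the multi-hour jobclient run out of the picture; a sketch, assuming the same module path as in the console output:

    # Hypothetical targeted re-runs of the two classes that hit the harness timeout
    mvn test -Dtest=TestClusterMRNotification \
        -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient
    mvn test -Dtest=TestMRTimelineEventHandling \
        -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient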


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1264 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1264/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapred.TestJobAclsManager
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.881 sec - in org.apache.hadoop.mapred.TestJobAclsManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLineRecordReader
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.759 sec - in org.apache.hadoop.mapred.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestClock
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.075 sec - in org.apache.hadoop.mapred.TestClock
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 sec - in org.apache.hadoop.mapred.TestJobQueueClient

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 242, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.207 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:39 min
[INFO] Finished at: 2016-04-12T06:04:56+00:00
[INFO] Final Memory: 32M/185M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
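
TestCLI.testGetJob fails on a bare assertTrue at TestCLI.java:181, so the mail carries no message beyond "null"; the per-class Surefire report is usually the quickest way to see the surrounding output. A sketch, assuming the standard surefire-reports naming convention and the core module path shown in the console:

    # Hypothetical reproduction in the core module, then inspect the per-class report
    mvn test -Dtest=TestCLI#testGetJob \
        -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core
    cat hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports/org.apache.hadoop.mapreduce.tools.TestCLI.txt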



Hadoop-Mapreduce-trunk-Java8 - Build # 1263 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1263/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Running org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.103 sec - in org.apache.hadoop.mapreduce.lib.map.TestMultithreadedMapper
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.936 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.515 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null

Tests run: 533, Failures: 4, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.297 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.069 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.195 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:52 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-11T21:35:56+00:00
[INFO] Final Memory: 34M/126M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
4 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)



Hadoop-Mapreduce-trunk-Java8 - Build # 1262 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1262/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.877 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.725 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.325 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.11 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.659 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:08 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 44.783 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.970 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:40 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:52 min
[INFO] Finished at: 2016-04-11T16:43:08+00:00
[INFO] Final Memory: 36M/207M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)



Hadoop-Mapreduce-trunk-Java8 - Build # 1261 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1261/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.63 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.088 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.725 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.268 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:186->testTaskAttemptAssignedKilledHistory:399 No Ta Started JH Event

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.785 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.493 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.289 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:09 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:22 min
[INFO] Finished at: 2016-04-11T06:38:02+00:00
[INFO] Final Memory: 36M/207M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:399)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:186)
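
The console above already names the surefire-reports directory for the "No Ta Started JH Event" assertion; a hedged sketch for gathering more detail, assuming the build is resumed from the App module as the error message itself suggests:

    # hypothetical: resume from the App module with full stack traces and debug logging,
    # then inspect the per-test Surefire reports referenced in the error above
    mvn -e -X test -rf :hadoop-mapreduce-client-app
    ls hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports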



Hadoop-Mapreduce-trunk-Java8 - Build # 1260 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1260/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8482 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestFileInputFormat
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.908 sec - in org.apache.hadoop.mapred.TestFileInputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.2 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.256 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.532 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Tests run: 236, Failures: 0, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.636 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:42 min
[INFO] Finished at: 2016-04-11T04:21:06+00:00
[INFO] Final Memory: 32M/232M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefirebooter2372701349343270519.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefire7689850057449846089tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefire_143137792919288622542tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
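
The "forked VM terminated without properly saying goodbye" error above is a Surefire fork crash rather than a test assertion, which would explain why no failed tests are listed for this run. The forked command line shown in the console includes -XX:+HeapDumpOnOutOfMemoryError, so one hedged follow-up is to look for HotSpot crash logs or heap dumps in the module directory (file names assumed from JVM defaults, not from this report):

    # hypothetical: check for JVM crash logs / heap dumps left by the forked test VM
    # (either pattern may simply be absent)
    cd hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core
    ls hs_err_pid*.log *.hprof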

Hadoop-Mapreduce-trunk-Java8 - Build # 1259 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1259/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8673 lines...]
  +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160409.003035-5168
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160409.003035-5168
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-tests:3.0.0-20160409.003037-4915
    +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-SNAPSHOT

[WARNING] Rule 0: org.apache.maven.plugins.enforcer.DependencyConvergence failed with message:
Failed while enforcing releasability the error(s) are [
Dependency convergence error for org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160409.003035-5168 paths to dependency are:
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160409.003035-5168
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-20160409.003035-5168
and
+-org.apache.hadoop:hadoop-mapreduce-client-app:3.0.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-yarn-server-tests:3.0.0-20160409.003037-4915
    +-org.apache.hadoop:hadoop-yarn-server-resourcemanager:3.0.0-SNAPSHOT
]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [ 10.197 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:45 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 37.278 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.699 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:21 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:00 min
[INFO] Finished at: 2016-04-10T17:30:24+00:00
[INFO] Final Memory: 59M/215M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-enforcer-plugin:1.3.1:enforce (depcheck) on project hadoop-mapreduce-client-app: Some Enforcer rules have failed. Look above for specific messages explaining why the rule failed. -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed
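
The build 1259 failure above comes from the maven-enforcer-plugin's DependencyConvergence rule, not from a test, which is why the failed-tests section is empty: the enforcer reports both a timestamped snapshot (3.0.0-20160409.003035-5168) and a plain 3.0.0-SNAPSHOT of hadoop-yarn-server-resourcemanager on different dependency paths. A hedged way to see where the diverging versions come from (the flags are standard Maven dependency-plugin options, assumed available, not taken from the console):

    # hypothetical: show the competing paths to hadoop-yarn-server-resourcemanager
    mvn dependency:tree -Dverbose \
        -Dincludes=org.apache.hadoop:hadoop-yarn-server-resourcemanager \
        -pl hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app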

Hadoop-Mapreduce-trunk-Java8 - Build # 1258 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1258/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4939 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.233 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.603 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.390 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:50 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.647 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.743 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:15 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-09T21:46:07+00:00
[INFO] Final Memory: 38M/305M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
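
The two testJobWithChangePriority failures above appear to assert the job priority reported back by the cluster (expected:<DEFAULT> but was:<HIGH>), and the TestUberAM sleep/random-writer failures are counter checks delegated to TestMRJobs. A sketch for re-running one of them in isolation, resuming from the jobclient module as the error message suggests (the Surefire filter below is an assumption, not part of this report):

    # hypothetical: resume from the jobclient module and run only the priority test
    mvn test -Dtest=TestMRJobs#testJobWithChangePriority -rf :hadoop-mapreduce-client-jobclient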



Hadoop-Mapreduce-trunk-Java8 - Build # 1257 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1257/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4945 lines...]
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.549 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.548 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.366 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 532, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.241 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:43 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.737 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.643 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:14 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:00 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:20 h
[INFO] Finished at: 2016-04-09T19:36:36+00:00
[INFO] Final Memory: 41M/275M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
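
The additional TestMRCJCFileOutputCommitter.testAbort failure in this run reports four files left behind where the abort path should have emptied the output directory. Stale output from an earlier run is one possible local cause, so a hedged re-run starts from a clean target (commands assumed, not taken from the console above):

    # hypothetical: clean, then re-run only the committer abort test from the jobclient module
    mvn clean test -Dtest=TestMRCJCFileOutputCommitter#testAbort -rf :hadoop-mapreduce-client-jobclient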



Hadoop-Mapreduce-trunk-Java8 - Build # 1256 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1256/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4949 lines...]
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.54 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.421 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.346 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:44 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.916 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.679 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:08 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:16 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-09T13:21:07+00:00
[INFO] Final Memory: 38M/287M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1255 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1255/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8204 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapCollection
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.72 sec - in org.apache.hadoop.mapreduce.TestMapCollection
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.859 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 6, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.179 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:56 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.162 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.386 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:51 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:36 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-09T02:29:37+00:00
[INFO] Final Memory: 39M/270M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
6 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1254 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1254/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4948 lines...]
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.233 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.506 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.438 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.840 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.609 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:45 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:32 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-08T23:28:50+00:00
[INFO] Final Memory: 38M/216M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1253 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1253/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4168 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.997 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.682 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.472 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.348 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:186->testTaskAttemptAssignedKilledHistory:399 No Ta Started JH Event

Tests run: 340, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.178 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [03:22 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 52.363 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  9.346 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:36 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20:08 min
[INFO] Finished at: 2016-04-08T19:47:07+00:00
[INFO] Final Memory: 56M/346M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:399)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:186)
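
Note on the TestKill failure above: "Task state not correct expected:<KILLED> but was:<NEW>" is JUnit 4's standard assertEquals failure format, where the test supplies a message plus an expected and an actual value and JUnit appends the expected/actual pair. A task still reported as NEW at assertion time suggests the kill was processed before the task ever left its initial state. The snippet below is a minimal, self-contained sketch of the kind of check that produces this output; the class and enum are illustrative stand-ins (it only assumes JUnit 4 on the classpath), not the actual TestKill source.

    import static org.junit.Assert.assertEquals;

    public class StateAssertionSketch {
        // Stand-in for the MapReduce TaskState enum referenced by the test.
        enum TaskState { NEW, SCHEDULED, RUNNING, SUCCEEDED, KILLED }

        public static void main(String[] args) {
            TaskState actual = TaskState.NEW;  // what the app master reported
            // Fails with:
            //   java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
            assertEquals("Task state not correct", TaskState.KILLED, actual);
        }
    }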



Hadoop-Mapreduce-trunk-Java8 - Build # 1252 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1252/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4168 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.412 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.265 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.045 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.684 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Tests in error: 
  TestJobHistoryEventHandler.cleanUpClass:98 » NoClassDefFound org/apache/hadoop...

Tests run: 341, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.299 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:45 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.898 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.767 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:04 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:24 min
[INFO] Finished at: 2016-04-08T16:29:50+00:00
[INFO] Final Memory: 55M/293M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Error Message:
org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/util/IntrusiveCollection$IntrusiveIterator
	at org.apache.hadoop.util.IntrusiveCollection.iterator(IntrusiveCollection.java:213)
	at org.apache.hadoop.util.IntrusiveCollection.clear(IntrusiveCollection.java:368)
	at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.clearPendingCachingCommands(DatanodeManager.java:1658)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1214)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1559)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:790)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:969)
	at org.apache.hadoop.hdfs.MiniDFSCluster.stopAndJoinNameNode(MiniDFSCluster.java:1965)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1911)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler.cleanUpClass(TestJobHistoryEventHandler.java:98)
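
Note on the failure above: the NoClassDefFoundError is raised while class-level cleanup (cleanUpClass, TestJobHistoryEventHandler.java:98 in the trace) shuts down a MiniDFSCluster. A class that was loadable earlier in the run can no longer be resolved at teardown, which on a shared CI slave usually points to classpath churn during the run (jars rebuilt or removed under the live JVM) rather than a logic bug in the test itself. Below is a rough sketch of the lifecycle the trace implies, assuming a standard MiniDFSCluster setup with the hadoop-hdfs test jars on the classpath; it is not the actual TestJobHistoryEventHandler source.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hdfs.MiniDFSCluster;
    import org.junit.AfterClass;
    import org.junit.BeforeClass;

    public class MiniClusterLifecycleSketch {
        private static MiniDFSCluster dfsCluster;

        @BeforeClass
        public static void setUpClass() throws Exception {
            Configuration conf = new Configuration();
            dfsCluster = new MiniDFSCluster.Builder(conf).build();
        }

        @AfterClass
        public static void cleanUpClass() throws Exception {
            if (dfsCluster != null) {
                // This is the style of call that failed above with
                // NoClassDefFoundError while the NameNode was stopping.
                dfsCluster.shutdown();
            }
        }
    }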



Hadoop-Mapreduce-trunk-Java8 - Build # 1251 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1251/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4948 lines...]
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.546 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.508 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.542 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:47 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 28.058 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.795 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:02 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:13 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-08T14:19:21+00:00
[INFO] Final Memory: 37M/280M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
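
Among the failures listed for this build, TestMRCJCFileOutputCommitter.testAbort asserts that aborting a job leaves its output directory empty; "Output directory not empty expected:<0> but was:<4>" means four files survived the abort. The sketch below shows that style of check in a self-contained form; the directory path and FileSystem setup are hypothetical and this is not the Hadoop test source.

    import static org.junit.Assert.assertEquals;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AbortOutputCheckSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            Path outDir = new Path("target/test-output");   // hypothetical location
            FileSystem fs = outDir.getFileSystem(conf);
            FileStatus[] leftovers =
                fs.exists(outDir) ? fs.listStatus(outDir) : new FileStatus[0];
            // Fails with "Output directory not empty expected:<0> but was:<4>"
            // when four files remain after the abort.
            assertEquals("Output directory not empty", 0, leftovers.length);
        }
    }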



Hadoop-Mapreduce-trunk-Java8 - Build # 1250 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1250/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8477 lines...]
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.96 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.145 sec - in org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.935 sec - in org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.547 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat

Results :

Failed tests: 
  TestJobEndNotifier.testNotificationTimeout:182 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  8.943 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [04:27 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:39 min
[INFO] Finished at: 2016-04-08T07:47:08+00:00
[INFO] Final Memory: 32M/189M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapred.TestJobEndNotifier.testNotificationTimeout

Error Message:
null

Stack Trace:
junit.framework.AssertionFailedError: null
	at junit.framework.Assert.fail(Assert.java:55)
	at junit.framework.Assert.assertTrue(Assert.java:22)
	at junit.framework.Assert.assertTrue(Assert.java:31)
	at junit.framework.TestCase.assertTrue(TestCase.java:201)
	at org.apache.hadoop.mapred.TestJobEndNotifier.testNotificationTimeout(TestJobEndNotifier.java:182)
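
The "null" error message above (and in several other traces in this thread) is what JUnit prints when an assertion is made without a message argument: the AssertionError carries no text, so only the file and line number in the stack trace locate the failing check. This particular test goes through junit.framework.TestCase (JUnit 3 style), while most of the other traces use org.junit.Assert (JUnit 4); both behave the same way here. A two-line illustration, not the TestJobEndNotifier source:

    import static org.junit.Assert.assertTrue;

    public class NullMessageSketch {
        public static void main(String[] args) {
            boolean notifiedWithinTimeout = false;       // hypothetical condition
            // Reported as "Error Message: null" in the build summary.
            assertTrue(notifiedWithinTimeout);
            // assertTrue("notification timed out", notifiedWithinTimeout);
            // would report "notification timed out" instead.
        }
    }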



Hadoop-Mapreduce-trunk-Java8 - Build # 1249 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1249/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8472 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.248 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.275 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.087 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.521 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.236 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:42 min
[INFO] Finished at: 2016-04-08T01:23:01+00:00
[INFO] Final Memory: 32M/185M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1248 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1248/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 4164 lines...]
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.571 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.38 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.617 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestRecovery.testCrashed:188 TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Tests in error: 
  TestRecovery.testSpeculative:1201 NullPointer

Tests run: 340, Failures: 1, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.673 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:47 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 27.310 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.650 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:49 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:11 min
[INFO] Finished at: 2016-04-07T06:30:38+00:00
[INFO] Final Memory: 55M/349M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Stack Trace:
java.lang.AssertionError: TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:382)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:188)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative(TestRecovery.java:1201)
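
In the testCrashed failure above, the assertion is raised from MRApp.waitForState, a helper that polls for a state transition and asserts once its deadline passes; the "(timedout)" marker in the message indicates the wait expired while the attempt was still STARTING rather than FAILED. (The testSpeculative NullPointerException, by contrast, fails before any assertion is reached.) Below is a rough, self-contained sketch of that poll-then-assert pattern; the Supplier, timings, and enum are illustrative and this is not the Hadoop MRApp implementation.

    import static org.junit.Assert.assertEquals;
    import java.util.function.Supplier;

    public class WaitForStateSketch {
        enum TaskAttemptState { STARTING, RUNNING, FAILED }

        static void waitForState(Supplier<TaskAttemptState> current,
                                 TaskAttemptState expected,
                                 long timeoutMillis) throws InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMillis;
            while (System.currentTimeMillis() < deadline) {
                if (current.get() == expected) {
                    return;                  // reached the expected state in time
                }
                Thread.sleep(50);
            }
            // Deadline passed: assert against the last observed state, marking
            // the message so the report shows the wait timed out.
            assertEquals("TaskAttempt state is not correct (timedout)",
                expected, current.get());
        }

        public static void main(String[] args) throws InterruptedException {
            // Always reports STARTING, so the wait times out and fails with:
            //   ... (timedout) expected:<FAILED> but was:<STARTING>
            waitForState(() -> TaskAttemptState.STARTING,
                TaskAttemptState.FAILED, 500);
        }
    }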



Hadoop-Mapreduce-trunk-Java8 - Build # 1247 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1247/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8726 lines...]
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.226 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.48 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.65 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.784 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:186->testTaskAttemptAssignedKilledHistory:399 No Ta Started JH Event
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.544 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:07 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.584 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  7.581 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:44 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:34 min
[INFO] Finished at: 2016-04-07T05:37:05+00:00
[INFO] Final Memory: 36M/234M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:399)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:186)
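
The recurring "No Ta Started JH Event" failure above is the complementary case to the messageless assertions noted earlier: when assertTrue is given a message, that message appears verbatim in the report. The check itself apparently scans the job-history events recorded during the run for a task-attempt-started event. A small sketch of that style of check; the event list and event-type names are hypothetical and this is not the TestTaskAttempt source.

    import static org.junit.Assert.assertTrue;
    import java.util.Arrays;
    import java.util.List;

    public class HistoryEventCheckSketch {
        public static void main(String[] args) {
            // Hypothetical record of event-type names captured by a mocked
            // job-history handler during the test run.
            List<String> recordedEventTypes = Arrays.asList(
                "TASK_STARTED", "MAP_ATTEMPT_KILLED");
            boolean sawAttemptStarted =
                recordedEventTypes.contains("MAP_ATTEMPT_STARTED");
            // Fails with "No Ta Started JH Event" when the started event was
            // never recorded; the message appears verbatim in the build report.
            assertTrue("No Ta Started JH Event", sawAttemptStarted);
        }
    }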



Hadoop-Mapreduce-trunk-Java8 - Build # 1246 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1246/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8826 lines...]

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.025 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.382 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.599 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:186->testTaskAttemptAssignedKilledHistory:399 No Ta Started JH Event

Tests run: 340, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.391 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.994 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.450 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:15 min
[INFO] Finished at: 2016-04-07T04:38:35+00:00
[INFO] Final Memory: 35M/193M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:399)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:186)



Hadoop-Mapreduce-trunk-Java8 - Build # 1245 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1245/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9600 lines...]
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.278 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.708 sec - in org.apache.hadoop.hdfs.TestNNBench

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM.testSleepJob:57->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testSleepJobWithMultipleReducers:64->TestMRJobs.testSleepJob:193->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM.testRandomWriter:84->TestMRJobs.testRandomWriter:539->verifyRandomWriterCounters:90->TestMRJobs.verifyRandomWriterCounters:552 null
  TestUberAM>TestMRJobs.testSleepJobWithRemoteJar:198->TestMRJobs.testSleepJobInternal:240->verifySleepJobCounters:71->TestMRJobs.verifySleepJobCounters:475 null
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 7, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.962 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.435 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.592 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:13 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-04-07T03:47:22+00:00
[INFO] Final Memory: 34M/145M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
7 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithMultipleReducers

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testRandomWriter

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifyRandomWriterCounters(TestMRJobs.java:552)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifyRandomWriterCounters(TestUberAM.java:90)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testSleepJobWithRemoteJar

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.verifySleepJobCounters(TestMRJobs.java:475)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.verifySleepJobCounters(TestUberAM.java:71)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1244 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1244/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.205 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.07 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.33 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.692 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.234 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.397 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.306 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:43 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:54 min
[INFO] Finished at: 2016-04-07T00:07:30+00:00
[INFO] Final Memory: 35M/198M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk-Java8 - Build # 1243 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1243/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9558 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 67.925 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 60.987 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.619 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 167.631 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.302 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:21 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 31.868 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.653 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [13:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:02 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:59 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:23 h
[INFO] Finished at: 2016-04-06T23:44:07+00:00
[INFO] Final Memory: 34M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1242 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1242/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 10005 lines...]

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests in error: 
  TestClusterMRNotification>NotificationTestCase.testMR:187 » IO Job didn't fini...
  TestMiniMRBringup.testBringUp:34 » YarnRuntime org.apache.hadoop.yarn.webapp.W...
  TestMRTimelineEventHandling.testMRTimelineEventHandling:136 » NoClassDefFound ...
  TestMerge.testMerge:91 » NoClassDefFound org/apache/hadoop/hdfs/server/namenod...
  TestClusterMapReduceTestCase>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » NoClassDefFound
  TestClusterMapReduceTestCase>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » NoClassDefFound
  TestClusterMapReduceTestCase>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » NoClassDefFound
  TestClusterMapReduceTestCase>ClusterMapReduceTestCase.setUp:56->ClusterMapReduceTestCase.startCluster:87 » NoClassDefFound
  TestMiniMRChildTask.setup:356 NoClassDefFound org/apache/hadoop/yarn/server/Mi...

Tests run: 521, Failures: 2, Errors: 8, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.635 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:45 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.983 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.487 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:48 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:30 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:32 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:52 h
[INFO] Finished at: 2016-04-06T20:44:07+00:00
[INFO] Final Memory: 34M/140M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7581852704555090576.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3006425328570174400tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_280764486093055778682tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
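
[Editor's note: the Surefire error in the console excerpt above, "The forked VM terminated without properly saying goodbye. VM crash or System.exit called?", is what Surefire reports whenever its forked test JVM dies before completing the handshake. One common cause is a test (or code under test) calling System.exit. The class below is a hypothetical illustration of that cause, not code from this build.]

import org.junit.Test;

public class ForkedVmExitExample {
    @Test
    public void exitKillsTheForkedJvm() {
        // When run under a forking Surefire configuration, this terminates the forked
        // JVM mid-run, and Surefire reports the "forked VM terminated" error above.
        System.exit(1);
    }
}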



###################################################################################
############################## FAILED TESTS (if any) ##############################
10 tests failed.
FAILED:  org.apache.hadoop.mapred.TestClusterMRNotification.testMR

Error Message:
Job didn't finish in 30 seconds

Stack Trace:
java.io.IOException: Job didn't finish in 30 seconds
	at org.apache.hadoop.mapred.UtilsForTests.runJobKill(UtilsForTests.java:672)
	at org.apache.hadoop.mapred.NotificationTestCase.testMR(NotificationTestCase.java:187)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestClusterMapReduceTestCase.testMapReduceRestarting

Error Message:
org/apache/hadoop/yarn/exceptions/YarnRuntimeException

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnRuntimeException
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestClusterMapReduceTestCase.testMRConfig

Error Message:
org/apache/hadoop/yarn/exceptions/YarnRuntimeException

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnRuntimeException
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestClusterMapReduceTestCase.testDFSRestart

Error Message:
org/apache/hadoop/yarn/exceptions/YarnRuntimeException

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnRuntimeException
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestClusterMapReduceTestCase.testMapReduce

Error Message:
org/apache/hadoop/yarn/exceptions/YarnRuntimeException

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnRuntimeException
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:58)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase$ConfigurableMiniMRCluster.<init>(ClusterMapReduceTestCase.java:101)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.startCluster(ClusterMapReduceTestCase.java:87)
	at org.apache.hadoop.mapred.ClusterMapReduceTestCase.setUp(ClusterMapReduceTestCase.java:56)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:24)
	at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:271)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:70)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.mapred.TestMerge.testMerge

Error Message:
org/apache/hadoop/hdfs/server/namenode/JournalSet$5

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/hdfs/server/namenode/JournalSet$5
	at org.apache.hadoop.hdfs.server.namenode.JournalSet.close(JournalSet.java:243)
	at org.apache.hadoop.hdfs.server.namenode.FSEditLog.close(FSEditLog.java:375)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.stopActiveServices(FSNamesystem.java:1203)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.close(FSNamesystem.java:1559)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stopCommonServices(NameNode.java:790)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.stop(NameNode.java:969)
	at org.apache.hadoop.hdfs.MiniDFSCluster.stopAndJoinNameNode(MiniDFSCluster.java:1965)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1911)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1882)
	at org.apache.hadoop.hdfs.MiniDFSCluster.shutdown(MiniDFSCluster.java:1875)
	at org.apache.hadoop.mapred.TestMerge.testMerge(TestMerge.java:91)


FAILED:  org.apache.hadoop.mapred.TestMiniMRBringup.testBringUp

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:875)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRClientClusterFactory.create(MiniMRClientClusterFactory.java:80)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:187)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:175)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:167)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:159)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:152)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:145)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:138)
	at org.apache.hadoop.mapred.MiniMRCluster.<init>(MiniMRCluster.java:133)
	at org.apache.hadoop.mapred.TestMiniMRBringup.testBringUp(TestMiniMRBringup.java:34)


FAILED:  org.apache.hadoop.mapred.TestMiniMRChildTask.org.apache.hadoop.mapred.TestMiniMRChildTask

Error Message:
org/apache/hadoop/yarn/server/MiniYARNCluster$CustomNodeManager

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/server/MiniYARNCluster$CustomNodeManager
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.TestMiniMRChildTask.setup(TestMiniMRChildTask.java:356)
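
[Editor's note: the NoClassDefFoundError / ClassNotFoundException failures above all point at classes (YarnRuntimeException, MiniYARNCluster$CustomNodeManager, JournalSet$5) that were not visible to the forked test JVM. A quick way to probe whether a class resolves on a given classpath is Class.forName; the class name below is taken from the log, everything else is a hypothetical sketch.]

public class ClasspathProbe {
    public static void main(String[] args) {
        String name = "org.apache.hadoop.yarn.exceptions.YarnRuntimeException";
        try {
            Class.forName(name);
            System.out.println(name + " is resolvable");
        } catch (ClassNotFoundException e) {
            // Matches the root cause shown in the stack traces above.
            System.out.println("not on classpath: " + name);
        }
    }
}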


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1241 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1241/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8820 lines...]
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.605 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.996 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.756 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.347 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.598 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:47 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.956 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.913 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:44 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14:07 min
[INFO] Finished at: 2016-04-06T17:02:43+00:00
[INFO] Final Memory: 35M/205M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk-Java8 - Build # 1240 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1240/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8821 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.461 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.293 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.066 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.721 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.568 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:41 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.809 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.428 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:07 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:22 min
[INFO] Finished at: 2016-04-06T08:32:57+00:00
[INFO] Final Memory: 35M/199M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)



Hadoop-Mapreduce-trunk-Java8 - Build # 1239 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1239/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8819 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.297 sec - in org.apache.hadoop.mapreduce.v2.app.TestFetchFailure
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.365 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.642 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.626 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:177->testTaskAttemptAssignedKilledHistory:388 No Ta Started JH Event

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.293 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.838 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.209 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:27 min
[INFO] Finished at: 2016-04-06T00:44:37+00:00
[INFO] Final Memory: 35M/200M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:388)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:177)
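
[Editor's note: unlike the expected/but-was failures in the other builds, this one comes from Assert.assertTrue(message, condition), which reports only the supplied message when the condition is false. A minimal, hypothetical illustration follows; it is not the body of TestTaskAttempt.]

import org.junit.Assert;
import org.junit.Test;

public class AssertTrueMessageExample {
    @Test
    public void reportsOnlyTheMessage() {
        boolean sawTaskAttemptStartedEvent = false; // stand-in for the job-history event check
        // Fails with: java.lang.AssertionError: No Ta Started JH Event
        Assert.assertTrue("No Ta Started JH Event", sawTaskAttemptStartedEvent);
    }
}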



Hadoop-Mapreduce-trunk-Java8 - Build # 1238 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1238/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9634 lines...]
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.521 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.228 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.505 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests in error: 
  TestMRKeyFieldBasedComparator.testBasicUnixComparator:111->testComparator:75 » Runtime

Tests run: 532, Failures: 2, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.147 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.422 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.242 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:57 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-05T23:59:38+00:00
[INFO] Final Memory: 34M/129M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter7208504992669182677.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire5922963332708881714tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2754339423566657532336tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapreduce.lib.partition.TestMRKeyFieldBasedComparator.testBasicUnixComparator

Error Message:
java.util.zip.ZipException: ZIP_Read: error reading zip file

Stack Trace:
java.lang.RuntimeException: java.util.zip.ZipException: ZIP_Read: error reading zip file
	at java.util.zip.ZipFile.read(Native Method)
	at java.util.zip.ZipFile.access$1400(ZipFile.java:61)
	at java.util.zip.ZipFile$ZipFileInputStream.read(ZipFile.java:715)
	at java.util.zip.ZipFile$ZipFileInflaterInputStream.fill(ZipFile.java:420)
	at java.util.zip.InflaterInputStream.read(InflaterInputStream.java:158)
	at java.io.FilterInputStream.read(FilterInputStream.java:133)
	at org.apache.xerces.impl.XMLEntityManager$RewindableInputStream.read(Unknown Source)
	at org.apache.xerces.impl.io.UTF8Reader.read(Unknown Source)
	at org.apache.xerces.impl.XMLEntityScanner.load(Unknown Source)
	at org.apache.xerces.impl.XMLEntityScanner.scanContent(Unknown Source)
	at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanContent(Unknown Source)
	at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
	at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
	at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2568)
	at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2556)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2627)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2580)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2484)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1045)
	at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1095)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2268)
	at org.apache.hadoop.fs.AbstractFileSystem.createFileSystem(AbstractFileSystem.java:159)
	at org.apache.hadoop.fs.AbstractFileSystem.get(AbstractFileSystem.java:250)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:332)
	at org.apache.hadoop.fs.FileContext$2.run(FileContext.java:329)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.fs.FileContext.getAbstractFileSystem(FileContext.java:329)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:446)
	at org.apache.hadoop.fs.FileContext.getFileContext(FileContext.java:423)
	at org.apache.hadoop.fs.FileContext.getLocalFSFileContext(FileContext.java:409)
	at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:115)
	at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:172)
	at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:788)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341)
	at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1359)
	at org.apache.hadoop.mapreduce.lib.partition.TestMRKeyFieldBasedComparator.testComparator(TestMRKeyFieldBasedComparator.java:75)
	at org.apache.hadoop.mapreduce.lib.partition.TestMRKeyFieldBasedComparator.testBasicUnixComparator(TestMRKeyFieldBasedComparator.java:111)
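
The ZipException above is thrown while Hadoop's Configuration lazily parses its XML resources, typically read out of a jar on the classpath, on the first property lookup; that is the getProps -> loadResources -> loadResource chain visible in the trace. A minimal sketch of that lazy-load behaviour; the property key is only an example of the fs.AbstractFileSystem.<scheme>.impl keys that AbstractFileSystem.createFileSystem resolves, not what this test queries:

    import org.apache.hadoop.conf.Configuration;

    public class ConfigurationLoadSketch {
      public static void main(String[] args) {
        // Constructing a Configuration only registers the default XML resources
        // (core-default.xml, core-site.xml); nothing is parsed yet.
        Configuration conf = new Configuration();

        // The first get() triggers loadResources() -> loadResource() -> DocumentBuilder.parse(),
        // reading the XML out of the classpath jar. A corrupted or concurrently rewritten jar
        // entry can surface here as a ZipException wrapped in a RuntimeException, as above.
        String fsImpl = conf.get("fs.AbstractFileSystem.file.impl");
        System.out.println(fsImpl);
      }
    }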


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1237 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1237/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9555 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.464 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.531 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.233 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.464 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.315 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:40 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.736 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.268 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:00 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:48 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:07 h
[INFO] Finished at: 2016-04-05T21:46:07+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1236 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1236/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8829 lines...]
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.144 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.441 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.004 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.758 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:177->testTaskAttemptAssignedKilledHistory:388 No Ta Started JH Event
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.476 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:08 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 30.163 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.874 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:44 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:36 min
[INFO] Finished at: 2016-04-05T16:47:18+00:00
[INFO] Final Memory: 35M/241M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)


FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:388)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:177)



Hadoop-Mapreduce-trunk-Java8 - Build # 1235 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1235/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9566 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.493 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.528 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.356 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.322 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.614 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.328 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:13 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-05T16:21:19+00:00
[INFO] Final Memory: 34M/128M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
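
The testAbort failure above is asserting that the job output directory is empty (expected 0 entries, found 4). A minimal sketch of one way such a check is commonly written against the Hadoop FileSystem API; the local filesystem, the path, and the listStatus-based count are placeholders, not TestMRCJCFileOutputCommitter's actual code:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class OutputDirCheckSketch {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.getLocal(conf);              // local FS stands in for the test FS
        Path outputDir = new Path("/tmp/output-dir-sketch");    // hypothetical output directory
        FileStatus[] leftover = fs.listStatus(outputDir);
        // A JUnit check of the same shape would be:
        //   assertEquals("Output directory not empty", 0, leftover.length);
        // which, with 4 files left behind, reports exactly
        //   "Output directory not empty expected:<0> but was:<4>"
        System.out.println("files left behind: " + leftover.length);
      }
    }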


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1234 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1234/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.467 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.567 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.366 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.162 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.931 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.383 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:54 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:49 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:08 h
[INFO] Finished at: 2016-04-05T13:38:49+00:00
[INFO] Final Memory: 34M/124M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1233 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1233/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9563 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.371 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.53 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.858 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.364 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:45 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.832 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.543 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:33 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:27 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-05T11:23:55+00:00
[INFO] Final Memory: 34M/155M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1232 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1232/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8818 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.394 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.546 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.624 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.755 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  4.681 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:14 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 33.336 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  8.172 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [12:50 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 15:51 min
[INFO] Finished at: 2016-04-05T02:38:22+00:00
[INFO] Final Memory: 36M/237M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)



Hadoop-Mapreduce-trunk-Java8 - Build # 1231 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1231/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.177 sec - in org.apache.hadoop.mapreduce.filecache.TestClientDistributedCacheManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestShufflePlugin
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.866 sec - in org.apache.hadoop.mapreduce.TestShufflePlugin
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestContextFactory
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.891 sec - in org.apache.hadoop.mapreduce.TestContextFactory
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.733 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.517 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:41 min
[INFO] Finished at: 2016-04-05T01:21:08+00:00
[INFO] Final Memory: 32M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1230 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1230/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8474 lines...]
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.424 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.349 sec - in org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.474 sec - in org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.128 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.355 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:41 min
[INFO] Finished at: 2016-04-05T00:37:33+00:00
[INFO] Final Memory: 33M/214M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1229 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1229/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9555 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.599 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.572 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.237 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.515 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.085 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.507 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.338 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:10 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-04-05T00:27:13+00:00
[INFO] Final Memory: 34M/148M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
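
The expected:<DEFAULT> but was:<HIGH> mismatch at TestMRJobs.java:276 is a plain two-argument assertEquals on a job-priority value, and its recurrence across builds has the shape of a timing flake: the priority is sampled once while an asynchronous priority change is still taking effect. Below is a self-contained Java sketch of re-reading such a value until it settles before asserting; the enum and Supplier here are illustrative stand-ins, not the Hadoop test's own types.

    import java.util.function.Supplier;
    import static org.junit.Assert.assertEquals;

    enum Priority { DEFAULT, HIGH }

    final class EventuallyEquals {
        // Re-read an asynchronously updated value instead of asserting one racy snapshot.
        static void assertEventually(Priority expected, Supplier<Priority> read,
                                     long timeoutMillis) throws InterruptedException {
            long deadline = System.currentTimeMillis() + timeoutMillis;
            Priority seen = read.get();
            while (seen != expected && System.currentTimeMillis() < deadline) {
                Thread.sleep(100);
                seen = read.get();
            }
            assertEquals(expected, seen);   // same assert, taken after the value has settled
        }
    }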



Hadoop-Mapreduce-trunk-Java8 - Build # 1228 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1228/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.89 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.261 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.082 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.404 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.250 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:38 min
[INFO] Finished at: 2016-04-04T18:29:42+00:00
[INFO] Final Memory: 32M/202M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1227 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1227/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9607 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.031 sec - in org.apache.hadoop.mapred.TestMiniMRDFSCaching
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.495 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.592 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.092 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 528, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.214 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.079 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.240 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:55 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-04-04T18:22:58+00:00
[INFO] Final Memory: 34M/183M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter309607231163077779.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire661285434513144230tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2996477158019054191779tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
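
Unlike the surrounding reports, this run did not stop on an assertion: Surefire reports "The forked VM terminated without properly saying goodbye. VM crash or System.exit called?", meaning the test JVM for hadoop-mapreduce-client-jobclient died before reporting back. The command line above shows it ran with -Xmx2048m and -XX:+HeapDumpOnOutOfMemoryError, so a crash or out-of-memory condition is one possibility. For the System.exit possibility the message itself raises, here is a hedged Java 8-era sketch of a test guard that turns an exit call into a catchable exception instead of killing the fork (the class name is made up, and SecurityManager is deprecated in much later JDKs):

    // Install once, e.g. from a @BeforeClass method:
    //   System.setSecurityManager(new NoExitGuard());
    final class NoExitGuard extends SecurityManager {
        @Override
        public void checkPermission(java.security.Permission perm) {
            // allow everything else so ordinary test code keeps working
        }
        @Override
        public void checkExit(int status) {
            throw new SecurityException("System.exit(" + status + ") blocked during tests");
        }
    }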



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
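
Among the three failures above, TestMRCJCFileOutputCommitter.testAbort is the distinct one: "Output directory not empty expected:<0> but was:<4>" is an assertEquals on a file count taken after the committer's abort path runs, and four entries were left behind in the job's output directory. A small, hypothetical sketch of that style of check over a local directory follows; the helper name and the use of java.io.File are assumptions, not the committer test's own code.

    import java.io.File;
    import static org.junit.Assert.assertEquals;

    final class OutputDirCheck {
        // Count whatever survives in the output directory after abort and expect nothing.
        static void assertOutputDirEmpty(File outDir) {
            File[] leftovers = outDir.listFiles();
            int remaining = (leftovers == null) ? 0 : leftovers.length;
            assertEquals("Output directory not empty", 0, remaining);
        }
    }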



Hadoop-Mapreduce-trunk-Java8 - Build # 1226 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1226/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8817 lines...]
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.839 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.985 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.281 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.655 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.353 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.635 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.407 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:07 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:18 min
[INFO] Finished at: 2016-04-04T13:34:05+00:00
[INFO] Final Memory: 35M/204M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
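
TestJobImpl drives an in-process job state machine through an event dispatcher, and assertJobState (TestJobImpl.java:1012) checks the state the job has reached; ERROR in place of SUCCEEDED is commonly the result of the job receiving an event it cannot handle in its current state, which drives the state machine to its internal-error state rather than to a wrong result. A generic, self-contained Java sketch of asserting a state machine's final state only after its queued events have been applied; none of these names are Hadoop's own.

    import java.util.ArrayDeque;
    import java.util.Queue;
    import static org.junit.Assert.assertEquals;

    final class DrainThenAssert {
        private final Queue<String> pendingTransitions = new ArrayDeque<>();
        private String state = "RUNNING";

        void post(String nextState) {
            pendingTransitions.add(nextState);
        }

        // Apply every queued transition before asserting, so the check is not
        // taken while events are still in flight.
        void drainAndAssert(String expected) {
            while (!pendingTransitions.isEmpty()) {
                state = pendingTransitions.poll();
            }
            assertEquals(expected, state);
        }
    }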



Hadoop-Mapreduce-trunk-Java8 - Build # 1225 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1225/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8473 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.299 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.285 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.107 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.842 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.597 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:49 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2016-04-04T09:21:38+00:00
[INFO] Final Memory: 32M/190M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1224 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1224/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9562 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.389 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.527 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.438 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.324 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.663 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.285 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:51 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:06 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:47 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-04-04T05:25:07+00:00
[INFO] Final Memory: 34M/154M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1223 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1223/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9564 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.hdfs.TestNNBench
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.551 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.507 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.066 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>

Tests run: 533, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.149 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.846 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.348 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:50 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:50 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:09 h
[INFO] Finished at: 2016-04-03T18:29:03+00:00
[INFO] Final Memory: 34M/129M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1222 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1222/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9373 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestCombineFileInputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.133 sec - in org.apache.hadoop.mapred.TestCombineFileInputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestInputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.033 sec - in org.apache.hadoop.mapred.TestInputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.jobcontrol.TestJobControl
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.197 sec - in org.apache.hadoop.mapred.jobcontrol.TestJobControl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.jobcontrol.TestLocalJobControl

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 332, Failures: 2, Errors: 0, Skipped: 3

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.347 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.186 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.309 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:47 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:07 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:06 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:25 h
[INFO] Finished at: 2016-04-03T09:45:10+00:00
[INFO] Final Memory: 39M/152M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter4577160216081697939.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire3881771950585221463tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_141766802045535935502tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1221 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1221/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8475 lines...]
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.503 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.405 sec - in org.apache.hadoop.mapreduce.lib.output.TestMapFileOutputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.483 sec - in org.apache.hadoop.mapreduce.lib.output.TestPreemptableFileOutputCommitter
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.143 sec - in org.apache.hadoop.mapreduce.lib.output.TestFileOutputFormat

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.207 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-04-02T00:34:05+00:00
[INFO] Final Memory: 32M/187M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1220 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1220/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9555 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.496 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.494 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.241 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.497 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.462 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.330 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.426 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:08 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:44 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:04 h
[INFO] Finished at: 2016-04-02T00:24:10+00:00
[INFO] Final Memory: 34M/153M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1219 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1219/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9601 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 72.763 sec - in org.apache.hadoop.mapred.TestReduceFetchFromPartialMem
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.215 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.477 sec - in org.apache.hadoop.mapred.TestSequenceFileAsBinaryOutputFormat

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests in error: 
  TestMiniMRClientCluster.testRestart:114 » YarnRuntime org.apache.hadoop.yarn.e...

Tests run: 533, Failures: 2, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.525 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.989 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.255 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-04-01T12:19:54+00:00
[INFO] Final Memory: 34M/134M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRClientCluster.testRestart

Error Message:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.BindException: Problem binding to [asf907.gq1.ygridcore.net:60188] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.net.BindException: Problem binding to [asf907.gq1.ygridcore.net:60188] java.net.BindException: Address already in use; For more details see:  http://wiki.apache.org/hadoop/BindException
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:414)
	at sun.nio.ch.Net.bind(Net.java:406)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at org.apache.hadoop.ipc.Server.bind(Server.java:529)
	at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:792)
	at org.apache.hadoop.ipc.Server.<init>(Server.java:2591)
	at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:958)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server.<init>(ProtobufRpcEngine.java:535)
	at org.apache.hadoop.ipc.ProtobufRpcEngine.getServer(ProtobufRpcEngine.java:510)
	at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:800)
	at org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl.createServer(RpcServerFactoryPBImpl.java:173)
	at org.apache.hadoop.yarn.factories.impl.pb.RpcServerFactoryPBImpl.getServer(RpcServerFactoryPBImpl.java:132)
	at org.apache.hadoop.yarn.ipc.HadoopYarnProtoRPC.getServer(HadoopYarnProtoRPC.java:65)
	at org.apache.hadoop.yarn.ipc.YarnRPC.getServer(YarnRPC.java:54)
	at org.apache.hadoop.yarn.server.resourcemanager.ApplicationMasterService.serviceStart(ApplicationMasterService.java:143)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$RMActiveServices.serviceStart(ResourceManager.java:675)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startActiveServices(ResourceManager.java:1097)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1137)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager$1.run(ResourceManager.java:1133)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.transitionToActive(ResourceManager.java:1133)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1173)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapred.MiniMRYarnClusterAdapter.restart(MiniMRYarnClusterAdapter.java:73)
	at org.apache.hadoop.mapred.TestMiniMRClientCluster.testRestart(TestMiniMRClientCluster.java:114)
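
The BindException above means the MiniYARNCluster restart tried to rebind a port (60188 on asf907) that was still held, either by another process on the shared build slave or by the previous cluster instance; the wiki page linked in the message covers this class of failure. A minimal sketch of the usual avoidance technique, binding to port 0 so the OS assigns a free ephemeral port, follows; it is illustrative only and not the MiniYARNCluster code itself.

import java.io.IOException;
import java.net.ServerSocket;

// Illustrative only: binding to port 0 avoids "Address already in use" in tests
// because the OS picks a currently free ephemeral port.
public class EphemeralPortSketch {
  public static void main(String[] args) throws IOException {
    try (ServerSocket socket = new ServerSocket(0)) { // 0 = any free port
      int port = socket.getLocalPort();               // the port actually assigned
      System.out.println("Bound to ephemeral port " + port);
    }
  }
}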


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1218 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1218/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9555 lines...]
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.908 sec - in org.apache.hadoop.mapred.pipes.TestPipeApplication
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJavaSerialization
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.918 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.306 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.145 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.280 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:38 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.998 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.415 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:40 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:00 h
[INFO] Finished at: 2016-04-01T09:45:07+00:00
[INFO] Final Memory: 34M/160M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1217 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1217/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8474 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.248 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.286 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.579 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.719 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:43 min
[INFO] Finished at: 2016-04-01T05:21:32+00:00
[INFO] Final Memory: 31M/192M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
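
The "null" error message above is not a NullPointerException: the stack trace shows TestCLI.java:181 calling the single-argument Assert.assertTrue, so the resulting AssertionError carries no message string, and the report renders the missing message as "null". A small sketch of the difference (the message text in the second test is hypothetical, added only for contrast):

import static org.junit.Assert.assertTrue;

import org.junit.Test;

// Illustrative only: a bare assertTrue throws an AssertionError with no message,
// which Jenkins/Surefire then reports as "null".
public class AssertMessageSketch {

  @Test
  public void withoutMessage() {
    assertTrue(false); // report shows: Error Message: null
  }

  @Test
  public void withMessage() {
    assertTrue("hypothetical explanation of what was expected", false); // message appears in the report
  }
}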



Hadoop-Mapreduce-trunk-Java8 - Build # 1216 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1216/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9072 lines...]
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.356 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEntities
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryServerLeveldbStateStoreService
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.213 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryServerLeveldbStateStoreService

Results :

Tests in error: 
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...
  TestHsWebServicesAcls.setup:84->buildHistoryContext:257 NoClassDefFound org/ap...

Tests run: 182, Failures: 0, Errors: 9, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.240 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 25.152 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.779 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:46 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [06:01 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:01 min
[INFO] Finished at: 2016-03-31T22:48:47+00:00
[INFO] Final Memory: 37M/200M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefirebooter1343723727041836907.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefire5719466652863574492tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire/surefire_1198833192491747192040tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
9 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobCountersAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptsAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptIdAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetSingleTaskCountersAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobConfAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTasksAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.testGetJobTaskAttemptIdCountersAcls

Error Message:
org/apache/hadoop/yarn/event/EventHandler

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/event/EventHandler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.buildHistoryContext(TestHsWebServicesAcls.java:257)
	at org.apache.hadoop.mapreduce.v2.hs.webapp.TestHsWebServicesAcls.setup(TestHsWebServicesAcls.java:84)
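
All nine failures above share one cause: the test JVM could not load org.apache.hadoop.yarn.event.EventHandler while building the history context, which most likely points at a missing or stale YARN common jar on the surefire classpath for that run rather than a bug in the tests themselves. A stand-alone probe like the following (illustrative, not part of the Hadoop test suite) can be run with the same classpath to confirm whether the class resolves and, if so, where it was loaded from:

// Illustrative diagnostic: checks whether a class name resolves on the current
// classpath and reports the code source it was loaded from.
public class ClasspathProbe {
  public static void main(String[] args) {
    String name = args.length > 0
        ? args[0]
        : "org.apache.hadoop.yarn.event.EventHandler";
    try {
      Class<?> clazz = Class.forName(name);
      Object source = clazz.getProtectionDomain().getCodeSource(); // may be null for JDK classes
      System.out.println(name + " loaded from " + source);
    } catch (ClassNotFoundException e) {
      System.out.println(name + " is NOT on the classpath");
    }
  }
}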



Hadoop-Mapreduce-trunk-Java8 - Build # 1215 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1215/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8470 lines...]
Tests run: 6, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 1.48 sec <<< FAILURE! - in org.apache.hadoop.mapreduce.tools.TestCLI
testGetJob(org.apache.hadoop.mapreduce.tools.TestCLI)  Time elapsed: 0.041 sec  <<< FAILURE!
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestTaskID
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.332 sec - in org.apache.hadoop.mapreduce.TestTaskID

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  3.694 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [02:10 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:15 min
[INFO] Finished at: 2016-03-31T21:02:03+00:00
[INFO] Final Memory: 32M/194M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1214 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1214/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9590 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.085 sec - in org.apache.hadoop.mapred.TestJavaSerialization
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.429 sec - in org.apache.hadoop.mapred.TestSpecialCharactersInOutputPath
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.ipc.TestMRCJCSocketFactory
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.144 sec - in org.apache.hadoop.ipc.TestMRCJCSocketFactory

Results :

Failed tests: 
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests in error: 
  TestMiniMRClientCluster.testJob:162 » NoClassDefFound org/apache/hadoop/yarn/c...

Tests run: 533, Failures: 2, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.196 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.949 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.330 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:00 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:11 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:42 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:01 h
[INFO] Finished at: 2016-03-31T20:40:46+00:00
[INFO] Final Memory: 34M/176M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMiniMRClientCluster.testJob

Error Message:
org/apache/hadoop/yarn/client/api/YarnClient

Stack Trace:
java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/client/api/YarnClient
	at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at java.lang.ClassLoader.defineClass1(Native Method)
	at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
	at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
	at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
	at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	at org.apache.hadoop.mapred.YARNRunner.<init>(YARNRunner.java:122)
	at org.apache.hadoop.mapred.YarnClientProtocolProvider.create(YarnClientProtocolProvider.java:34)
	at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:111)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:98)
	at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:91)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1311)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1742)
	at org.apache.hadoop.mapreduce.Job.connect(Job.java:1307)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1335)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1359)
	at org.apache.hadoop.mapred.TestMiniMRClientCluster.testJob(TestMiniMRClientCluster.java:162)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1213 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1213/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8470 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.875 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.261 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.093 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.424 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.167 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:34 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:38 min
[INFO] Finished at: 2016-03-31T16:20:48+00:00
[INFO] Final Memory: 32M/189M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1212 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1212/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8468 lines...]
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.083 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.267 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.083 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.428 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.254 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:35 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:38 min
[INFO] Finished at: 2016-03-31T15:20:39+00:00
[INFO] Final Memory: 32M/186M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 test failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)



Hadoop-Mapreduce-trunk-Java8 - Build # 1211 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1211/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9551 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.51 sec - in org.apache.hadoop.hdfs.TestNNBench
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.503 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.214 sec - in org.apache.hadoop.util.TestMRCJCRunJar
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.io.TestSequenceFileMergeProgress
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.468 sec - in org.apache.hadoop.io.TestSequenceFileMergeProgress

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.401 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.250 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.331 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:10 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:03 h
[INFO] Finished at: 2016-03-31T11:22:33+00:00
[INFO] Final Memory: 34M/150M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)
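
Note on the two testJobWithChangePriority failures above: the expected:<DEFAULT> but was:<HIGH> mismatch comes from a single assertEquals on the job priority at TestMRJobs.java:276, which suggests the value is read at a point where it has not yet settled to the expected one. One defensive pattern is to retry the assertion until it holds or a deadline passes. The helper below is only a minimal sketch of that pattern; assertEventually is a made-up name and is not part of JUnit or the Hadoop test utilities.

    // Hypothetical sketch: retry an assertion until it passes or a deadline expires,
    // to tolerate an asynchronous update such as a job priority change.
    static void assertEventually(long timeoutMs, long intervalMs, Runnable assertion)
        throws InterruptedException {
      long deadline = System.currentTimeMillis() + timeoutMs;
      while (true) {
        try {
          assertion.run();          // throws AssertionError while the check still fails
          return;
        } catch (AssertionError e) {
          if (System.currentTimeMillis() > deadline) {
            throw e;                // deadline reached: surface the last failure
          }
          Thread.sleep(intervalMs);
        }
      }
    }

On Java 8 the existing check could be wrapped as assertEventually(10000, 250, () -> assertEquals(expected, observedPriority())), where observedPriority() stands in for however the test actually reads the value.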



Hadoop-Mapreduce-trunk-Java8 - Build # 1210 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1210/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8815 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.314 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.265 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.096 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.762 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestRecovery.testCrashed:188 TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.330 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.198 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.296 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:12 min
[INFO] Finished at: 2016-03-31T07:34:17+00:00
[INFO] Final Memory: 35M/204M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed

Error Message:
TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>

Stack Trace:
java.lang.AssertionError: TaskAttempt state is not correct (timedout) expected:<FAILED> but was:<STARTING>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:377)
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testCrashed(TestRecovery.java:188)
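
The "(timedout)" marker in the message above shows that MRApp.waitForState polled for the FAILED state until its deadline and then asserted on whatever it last observed, here STARTING. The general shape of such a wait-then-assert helper is sketched below; the names and the 50 ms poll interval are illustrative and do not reproduce the actual MRApp code.

    // Illustrative state-wait helper: poll a supplier until the expected value
    // appears, then assert, so a timeout still reports the observed mismatch.
    static <T> void waitForState(java.util.function.Supplier<T> current, T expected,
        long timeoutMs) throws InterruptedException {
      long deadline = System.currentTimeMillis() + timeoutMs;
      T observed = current.get();
      while (!expected.equals(observed) && System.currentTimeMillis() < deadline) {
        Thread.sleep(50);
        observed = current.get();
      }
      org.junit.Assert.assertEquals("state is not correct (timedout)", expected, observed);
    }

With that structure, a failure like the one above usually means either the deadline is too short for a loaded build slave or the attempt genuinely never left STARTING.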



Hadoop-Mapreduce-trunk-Java8 - Build # 1209 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1209/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8824 lines...]
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.464 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempts
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.313 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesJobs
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.936 sec - in org.apache.hadoop.mapreduce.v2.app.webapp.TestAMWebServicesAttempt
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapreduceConfigFields
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.75 sec - in org.apache.hadoop.mapreduce.TestMapreduceConfigFields

Results :

Failed tests: 
  TestKill.testKillJob:84 Task state not correct expected:<KILLED> but was:<NEW>
  TestKill.testKillTask:119 Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 2, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.571 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:55 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [01:06 min]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 16.441 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [15:42 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 20:07 min
[INFO] Finished at: 2016-03-31T00:47:08+00:00
[INFO] Final Memory: 36M/247M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob

Error Message:
Task state not correct expected:<KILLED> but was:<NEW>

Stack Trace:
java.lang.AssertionError: Task state not correct expected:<KILLED> but was:<NEW>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillJob(TestKill.java:84)


FAILED:  org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask

Error Message:
Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: Job state is not correct (timedout) expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.apache.hadoop.mapreduce.v2.app.MRApp.waitForState(MRApp.java:411)
	at org.apache.hadoop.mapreduce.v2.app.TestKill.testKillTask(TestKill.java:119)
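
Both TestKill failures above point at ordering between issuing the kill and the job or task state machine making progress: a task killed while still NEW may never reach KILLED, and the job can end in ERROR instead of SUCCEEDED when the kill races with completion. A common way to remove such a race in a test is to block until the task is known to be running before sending the kill. The latch-based sketch below is hypothetical; the latch wiring and killAction are placeholders, not Hadoop APIs.

    // Hypothetical ordering guard: the test only issues the kill after the task
    // attempt has signalled (via the latch) that it is actually running.
    static void killAfterStart(java.util.concurrent.CountDownLatch started,
        Runnable killAction) throws InterruptedException {
      if (!started.await(30, java.util.concurrent.TimeUnit.SECONDS)) {
        org.junit.Assert.fail("task never started; cannot exercise the kill path");
      }
      killAction.run();   // trigger the kill only once the precondition holds
    }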



Hadoop-Mapreduce-trunk-Java8 - Build # 1208 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1208/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9550 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 69.118 sec - in org.apache.hadoop.mapreduce.v2.TestMRJobsWithHistoryService
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 60.509 sec - in org.apache.hadoop.mapreduce.v2.TestMRAMWithNonNormalizedCapabilities
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestValueIterReset
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.85 sec - in org.apache.hadoop.mapreduce.TestValueIterReset
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.TestMapReduceLazyOutput
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 191.56 sec - in org.apache.hadoop.mapreduce.TestMapReduceLazyOutput

Results :

Failed tests: 
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 533, Failures: 2, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  5.101 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [02:34 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 39.748 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [ 10.436 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [12:24 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [07:05 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  02:15 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:38 h
[INFO] Finished at: 2016-03-31T00:17:30+00:00
[INFO] Final Memory: 34M/164M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1207 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1207/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8811 lines...]
Running org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 104.781 sec - in org.apache.hadoop.mapreduce.v2.app.rm.TestRMContainerAllocator
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestEvents
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.743 sec - in org.apache.hadoop.mapreduce.jobhistory.TestEvents
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.4 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobSummary
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.365 sec - in org.apache.hadoop.mapreduce.jobhistory.TestJobHistoryEventHandler

Results :

Tests in error: 
  TestRecovery.testSpeculative:1201 NullPointer

Tests run: 340, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.667 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:43 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 29.128 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.758 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:25 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:48 min
[INFO] Finished at: 2016-03-30T20:33:35+00:00
[INFO] Final Memory: 36M/204M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative

Error Message:
null

Stack Trace:
java.lang.NullPointerException: null
	at org.apache.hadoop.mapreduce.v2.app.TestRecovery.testSpeculative(TestRecovery.java:1201)
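
A bare NullPointerException whose message is just "null", as above, usually means a fixture lookup returned null at TestRecovery.java:1201 and was dereferenced without a check. Asserting on the intermediate value first makes the report say what was missing. The snippet below is only an illustration; the method name and message are placeholders, not the actual test code.

    // Illustration: turn a raw NullPointerException into a descriptive test failure.
    static void checkRecovered(Object recoveredAttempt) {
      org.junit.Assert.assertNotNull(
          "no task attempt was recovered after the restart", recoveredAttempt);
    }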



Hadoop-Mapreduce-trunk-Java8 - Build # 1206 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1206/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8972 lines...]
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistory.testRefreshLoadedJobCacheUnSupportedOperation(TestJobHistory.java:482)

Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.502 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJobHistoryEvents
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.57 sec - in org.apache.hadoop.mapreduce.v2.hs.TestHistoryFileManager
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.603 sec - in org.apache.hadoop.mapreduce.v2.hs.TestJHSDelegationTokenSecretManager

Results :

Tests in error: 
  TestJobHistory.testRefreshLoadedJobCacheUnSupportedOperation:482 » YarnRuntime

Tests run: 207, Failures: 0, Errors: 1, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.332 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.055 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.212 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [10:53 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. FAILURE [06:09 min]
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19:10 min
[INFO] Finished at: 2016-03-30T07:47:47+00:00
[INFO] Final Memory: 38M/200M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-hs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-hs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-hs
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.hs.TestJobHistory.testRefreshLoadedJobCacheUnSupportedOperation

Error Message:
Failed to intialize existing directories

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: Failed to intialize existing directories
	at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:461)
	at org.apache.hadoop.fs.DelegateToFileSystem.listStatus(DelegateToFileSystem.java:168)
	at org.apache.hadoop.fs.ChecksumFs.listStatus(ChecksumFs.java:521)
	at org.apache.hadoop.fs.AbstractFileSystem$1.<init>(AbstractFileSystem.java:890)
	at org.apache.hadoop.fs.AbstractFileSystem.listStatusIterator(AbstractFileSystem.java:888)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1492)
	at org.apache.hadoop.fs.FileContext$22.next(FileContext.java:1487)
	at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
	at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1487)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:457)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:444)
	at org.apache.hadoop.mapreduce.v2.jobhistory.JobHistoryUtils.localGlobber(JobHistoryUtils.java:439)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.findTimestampedDirectories(HistoryFileManager.java:811)
	at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.initExisting(HistoryFileManager.java:705)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory.serviceInit(JobHistory.java:98)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory$$EnhancerByMockitoWithCGLIB$$af248edf.CGLIB$serviceInit$19(<generated>)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory$$EnhancerByMockitoWithCGLIB$$af248edf$$FastClassByMockitoWithCGLIB$$7f98c519.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory$$EnhancerByMockitoWithCGLIB$$af248edf.serviceInit(<generated>)
	at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory$$EnhancerByMockitoWithCGLIB$$af248edf.CGLIB$init$41(<generated>)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory$$EnhancerByMockitoWithCGLIB$$af248edf$$FastClassByMockitoWithCGLIB$$7f98c519.invoke(<generated>)
	at org.mockito.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:216)
	at org.mockito.internal.creation.AbstractMockitoMethodProxy.invokeSuper(AbstractMockitoMethodProxy.java:10)
	at org.mockito.internal.invocation.realmethod.CGLIBProxyRealMethod.invoke(CGLIBProxyRealMethod.java:22)
	at org.mockito.internal.invocation.realmethod.FilteredCGLIBProxyRealMethod.invoke(FilteredCGLIBProxyRealMethod.java:27)
	at org.mockito.internal.invocation.Invocation.callRealMethod(Invocation.java:211)
	at org.mockito.internal.stubbing.answers.CallsRealMethods.answer(CallsRealMethods.java:36)
	at org.mockito.internal.MockHandler.handle(MockHandler.java:99)
	at org.mockito.internal.creation.MethodInterceptorFilter.intercept(MethodInterceptorFilter.java:47)
	at org.apache.hadoop.mapreduce.v2.hs.JobHistory$$EnhancerByMockitoWithCGLIB$$af248edf.init(<generated>)
	at org.apache.hadoop.mapreduce.v2.hs.TestJobHistory.testRefreshLoadedJobCacheUnSupportedOperation(TestJobHistory.java:482)
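
The YarnRuntimeException above is thrown while JobHistory.serviceInit scans the configured history directories, before the body of testRefreshLoadedJobCacheUnSupportedOperation ever runs; RawLocalFileSystem.listStatus fails when the directory on the build slave is missing or unreadable. One way a test avoids depending on a stale shared workspace is to point the history directories at a fresh temporary location. The sketch below assumes the standard mapreduce.jobhistory.done-dir and mapreduce.jobhistory.intermediate-done-dir keys are the ones the service reads; the method name is illustrative.

    // Sketch: build a configuration whose history directories are fresh and writable
    // for this run, so serviceInit does not trip over leftovers from earlier builds.
    static org.apache.hadoop.conf.Configuration freshHistoryConf() throws java.io.IOException {
      org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
      String base = java.nio.file.Files.createTempDirectory("jobhistory-test").toString();
      conf.set("mapreduce.jobhistory.done-dir", base + "/done");
      conf.set("mapreduce.jobhistory.intermediate-done-dir", base + "/intermediate-done");
      return conf;
    }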



Hadoop-Mapreduce-trunk-Java8 - Build # 1205 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1205/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8482 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestFileInputFormat
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.824 sec - in org.apache.hadoop.mapred.TestFileInputFormat
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobEndNotifier
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.896 sec - in org.apache.hadoop.mapred.TestJobEndNotifier
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestOldMethodsJobID
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.277 sec - in org.apache.hadoop.mapred.TestOldMethodsJobID
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobQueueClient
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.mapred.TestJobQueueClient
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestJobConf
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.548 sec - in org.apache.hadoop.mapred.TestJobConf

Results :

Tests run: 225, Failures: 0, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.358 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-03-30T05:20:38+00:00
[INFO] Final Memory: 31M/217M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefirebooter4397120352537627912.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefire2152081378783758569tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire/surefire_77316622558594551843tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
All tests passed

Hadoop-Mapreduce-trunk-Java8 - Build # 1204 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1204/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8469 lines...]
Running org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.157 sec - in org.apache.hadoop.mapreduce.lib.input.TestLineRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.265 sec - in org.apache.hadoop.mapreduce.lib.input.TestCombineFileRecordReader
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.219 sec - in org.apache.hadoop.mapreduce.lib.partition.TestRehashPartitioner
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.672 sec - in org.apache.hadoop.mapreduce.jobhistory.TestHistoryViewerPrinter

Results :

Failed tests: 
  TestCLI.testGetJob:181 null

Tests run: 241, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.197 s]
[INFO] Apache Hadoop MapReduce Core ...................... FAILURE [01:36 min]
[INFO] Apache Hadoop MapReduce Common .................... SKIPPED
[INFO] Apache Hadoop MapReduce Shuffle ................... SKIPPED
[INFO] Apache Hadoop MapReduce App ....................... SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:40 min
[INFO] Finished at: 2016-03-30T01:12:53+00:00
[INFO] Final Memory: 31M/214M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-core: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-core/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-core
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:86)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.junit.Assert.assertTrue(Assert.java:52)
	at org.apache.hadoop.mapreduce.tools.TestCLI.testGetJob(TestCLI.java:181)
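
The error message "null" above is what JUnit prints when assertTrue is called without a message argument, as at TestCLI.java:181. Supplying a message turns the report into something actionable; the snippet below is a generic illustration with placeholder names, not the actual TestCLI code.

    // Generic illustration: the two-argument assertTrue reports a reason instead of "null".
    static void checkJobFound(String jobId, boolean found) {
      org.junit.Assert.assertTrue("CLI did not return job " + jobId, found);
    }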



Hadoop-Mapreduce-trunk-Java8 - Build # 1203 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1203/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9580 lines...]
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.265 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.523 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.402 sec - in org.apache.hadoop.util.TestMRCJCRunJar

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests in error: 
  TestUberAM.setup:45->TestMRJobs.setup:165 » YarnRuntime org.apache.hadoop.yarn...

Tests run: 518, Failures: 2, Errors: 1, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.728 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:39 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.876 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  6.050 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:14 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:46 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:06 h
[INFO] Finished at: 2016-03-30T00:53:26+00:00
[INFO] Final Memory: 34M/139M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient && /home/jenkins/tools/java/jdk1.8.0/jre/bin/java -Xmx2048m -XX:MaxPermSize=768m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefirebooter5627580275638559067.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire774170787237343470tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire/surefire_2494420547217391862788tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.org.apache.hadoop.mapreduce.v2.TestUberAM

Error Message:
org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server

Stack Trace:
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: org.apache.hadoop.yarn.webapp.WebAppException: Error starting http server
	at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:875)
	at org.apache.hadoop.yarn.webapp.WebApps$Builder.start(WebApps.java:348)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.startWepApp(ResourceManager.java:1078)
	at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.serviceStart(ResourceManager.java:1176)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.startResourceManager(MiniYARNCluster.java:335)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.access$300(MiniYARNCluster.java:112)
	at org.apache.hadoop.yarn.server.MiniYARNCluster$ResourceManagerWrapper.serviceStart(MiniYARNCluster.java:464)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.service.CompositeService.serviceStart(CompositeService.java:120)
	at org.apache.hadoop.yarn.server.MiniYARNCluster.serviceStart(MiniYARNCluster.java:292)
	at org.apache.hadoop.mapreduce.v2.MiniMRYarnCluster.serviceStart(MiniMRYarnCluster.java:191)
	at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.setup(TestMRJobs.java:165)
	at org.apache.hadoop.mapreduce.v2.TestUberAM.setup(TestUberAM.java:45)
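
This first failure is not in a test body at all: MiniYARNCluster cannot start the ResourceManager web app ("Error starting http server") during setup, which on a shared Jenkins slave is most often a port that is already bound. When a fixed port must be avoided, a test can ask the OS for a free one before starting the cluster. The sketch below uses plain java.net.ServerSocket; yarn.resourcemanager.webapp.address is the standard key for the RM web UI address, but whether the mini cluster picks up an override set this way is an assumption, and a small race remains between releasing and reusing the port.

    // Sketch: bind port 0 so the OS picks a free ephemeral port, release it, and
    // hand that port to the configuration before the mini cluster starts.
    static org.apache.hadoop.conf.Configuration withFreeRmWebappPort(
        org.apache.hadoop.conf.Configuration conf) throws java.io.IOException {
      int port;
      try (java.net.ServerSocket probe = new java.net.ServerSocket(0)) {
        port = probe.getLocalPort();
      }
      conf.set("yarn.resourcemanager.webapp.address", "0.0.0.0:" + port);
      return conf;
    }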


FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)
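
"Output directory not empty expected:<0> but was:<4>" means the directory that the abort path should have cleaned still held four entries, either because abort left them behind or because files from an earlier run were never removed. Starting the test from an empty directory separates the two causes. The sketch below uses the local Hadoop FileSystem API; the directory name is a placeholder.

    // Sketch: start each run from an empty output directory so leftovers from a
    // previous run cannot be mistaken for files that abort() failed to clean up.
    static org.apache.hadoop.fs.Path freshOutputDir(org.apache.hadoop.conf.Configuration conf)
        throws java.io.IOException {
      org.apache.hadoop.fs.Path out = new org.apache.hadoop.fs.Path(
          System.getProperty("java.io.tmpdir"), "committer-abort-test");
      org.apache.hadoop.fs.FileSystem fs = org.apache.hadoop.fs.FileSystem.getLocal(conf);
      fs.delete(out, true);   // recursive delete; returns false if nothing was there
      fs.mkdirs(out);
      return out;
    }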


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)



Hadoop-Mapreduce-trunk-Java8 - Build # 1202 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1202/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 9557 lines...]
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.conf.TestNoDefaultsJobConf
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.586 sec - in org.apache.hadoop.conf.TestNoDefaultsJobConf
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCReflectionUtils
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.521 sec - in org.apache.hadoop.util.TestMRCJCReflectionUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.util.TestMRCJCRunJar
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.236 sec - in org.apache.hadoop.util.TestMRCJCRunJar

Results :

Failed tests: 
  TestMRCJCFileOutputCommitter.testAbort:153 Output directory not empty expected:<0> but was:<4>
  TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>
  TestUberAM>TestMRJobs.testJobWithChangePriority:276 expected:<DEFAULT> but was:<HIGH>

Tests run: 531, Failures: 3, Errors: 0, Skipped: 11

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.219 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:37 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 24.743 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  5.322 s]
[INFO] Apache Hadoop MapReduce App ....................... SUCCESS [11:05 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SUCCESS [06:12 min]
[INFO] Apache Hadoop MapReduce JobClient ................. FAILURE [  01:43 h]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:02 h
[INFO] Finished at: 2016-03-29T22:39:03+00:00
[INFO] Final Memory: 34M/160M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-jobclient: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-jobclient
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
FAILED:  org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort

Error Message:
Output directory not empty expected:<0> but was:<4>

Stack Trace:
java.lang.AssertionError: Output directory not empty expected:<0> but was:<4>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.apache.hadoop.mapred.TestMRCJCFileOutputCommitter.testAbort(TestMRCJCFileOutputCommitter.java:153)


FAILED:  org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)


FAILED:  org.apache.hadoop.mapreduce.v2.TestUberAM.testJobWithChangePriority

Error Message:
expected:<DEFAULT> but was:<HIGH>

Stack Trace:
java.lang.AssertionError: expected:<DEFAULT> but was:<HIGH>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.TestMRJobs.testJobWithChangePriority(TestMRJobs.java:276)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:483)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:47)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:44)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.internal.runners.statements.FailOnTimeout$StatementThread.run(FailOnTimeout.java:74)



Hadoop-Mapreduce-trunk-Java8 - Build # 1201 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1201/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8814 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.375 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.237 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.024 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.714 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned:177->testTaskAttemptAssignedKilledHistory:388 No Ta Started JH Event

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.340 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:42 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 26.172 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.708 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [11:03 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:20 min
[INFO] Finished at: 2016-03-29T19:33:49+00:00
[INFO] Final Memory: 35M/193M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned

Error Message:
No Ta Started JH Event

Stack Trace:
java.lang.AssertionError: No Ta Started JH Event
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.assertTrue(Assert.java:41)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testTaskAttemptAssignedKilledHistory(TestTaskAttempt.java:388)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestTaskAttempt.testMRAppHistoryForTAFailedInAssigned(TestTaskAttempt.java:177)
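
(Editorial note on the failure shape: "No Ta Started JH Event" is a caller-supplied assertion message, which is what JUnit 4's Assert.assertTrue(message, condition) reports when the condition is false. The sketch below only illustrates that mechanism; the JHEvent record and the event-type string are hypothetical and not taken from the actual TestTaskAttempt source.)

    import java.util.List;
    import org.junit.Assert;

    public class HistoryEventAssertionSketch {
      // Hypothetical job-history (JH) event record; the real test inspects events
      // emitted by the MR app master, which are not reproduced here.
      static class JHEvent {
        final String type;
        JHEvent(String type) { this.type = type; }
      }

      static void assertTaskAttemptStartedRecorded(List<JHEvent> events) {
        boolean started = events.stream()
            .anyMatch(e -> "TASK_ATTEMPT_STARTED".equals(e.type));
        // With assertTrue(message, condition), a false condition raises an
        // AssertionError carrying the message verbatim -- which is how
        // "No Ta Started JH Event" appears as the error message above.
        Assert.assertTrue("No Ta Started JH Event", started);
      }
    }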



Hadoop-Mapreduce-trunk-Java8 - Build # 1200 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Mapreduce-trunk-Java8/1200/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 8817 lines...]
Running org.apache.hadoop.mapreduce.v2.app.TestMRApp
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.509 sec - in org.apache.hadoop.mapreduce.v2.app.TestMRApp
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.343 sec - in org.apache.hadoop.mapred.TestTaskAttemptFinishingMonitor
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.042 sec - in org.apache.hadoop.mapred.TestTaskAttemptListenerImpl
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=768m; support was removed in 8.0
Running org.apache.hadoop.mapred.TestLocalContainerLauncher
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.61 sec - in org.apache.hadoop.mapred.TestLocalContainerLauncher

Results :

Failed tests: 
  TestJobImpl.testUnusableNodeTransition:629->assertJobState:1012 expected:<SUCCEEDED> but was:<ERROR>

Tests run: 340, Failures: 1, Errors: 0, Skipped: 0

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop MapReduce Client .................... SUCCESS [  2.234 s]
[INFO] Apache Hadoop MapReduce Core ...................... SUCCESS [01:34 min]
[INFO] Apache Hadoop MapReduce Common .................... SUCCESS [ 23.686 s]
[INFO] Apache Hadoop MapReduce Shuffle ................... SUCCESS [  4.191 s]
[INFO] Apache Hadoop MapReduce App ....................... FAILURE [10:58 min]
[INFO] Apache Hadoop MapReduce HistoryServer ............. SKIPPED
[INFO] Apache Hadoop MapReduce JobClient ................. SKIPPED
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ..... SKIPPED
[INFO] Apache Hadoop MapReduce NativeTask ................ SKIPPED
[INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
[INFO] Apache Hadoop MapReduce ........................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:04 min
[INFO] Finished at: 2016-03-29T10:32:14+00:00
[INFO] Final Memory: 36M/227M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.17:test (default-test) on project hadoop-mapreduce-client-app: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Mapreduce-trunk-Java8/hadoop-mapreduce-project/hadoop-mapreduce-client/hadoop-mapreduce-client-app/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-mapreduce-client-app
Build step 'Execute shell' marked build as failure
[FINDBUGS] Skipping publisher since build result is FAILURE
Archiving artifacts
Recording test results
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any



###################################################################################
############################## FAILED TESTS (if any) ##############################
1 tests failed.
FAILED:  org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition

Error Message:
expected:<SUCCEEDED> but was:<ERROR>

Stack Trace:
java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:144)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.assertJobState(TestJobImpl.java:1012)
	at org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl.testUnusableNodeTransition(TestJobImpl.java:629)
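
(Editorial note on the failure shape: the trace goes through a helper named assertJobState before the assertEquals call, and the "expected:<SUCCEEDED> but was:<ERROR>" message again comes from JUnit 4's assertEquals. The sketch below is a hypothetical poll-then-assert helper of the same general shape, written only to show how such a message can arise; the JobState enum, JobView interface, and timing values are assumptions, not the actual TestJobImpl code.)

    import org.junit.Assert;

    public class JobStateAssertionSketch {
      // Illustrative stand-in for an internal job state enum; not the real class.
      enum JobState { NEW, RUNNING, SUCCEEDED, ERROR }

      interface JobView {
        JobState getState();
      }

      // Hypothetical helper: wait briefly for the job to reach the expected state,
      // then assert on whatever state it actually settled in.
      static void assertJobState(JobView job, JobState expected) throws InterruptedException {
        long deadline = System.currentTimeMillis() + 5_000;
        while (job.getState() != expected && System.currentTimeMillis() < deadline) {
          Thread.sleep(50);
        }
        // If the job ends in ERROR instead of SUCCEEDED, this throws
        // "java.lang.AssertionError: expected:<SUCCEEDED> but was:<ERROR>",
        // matching the message reported above.
        Assert.assertEquals(expected, job.getState());
      }
    }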